Publication
ASRU 2003
Conference paper

Forward-backward modeling in statistical natural concept generation for interlingua-based speech-to-speech translation

Abstract

Natural concept generation is critical to the performance of statistical interlingua-based speech-to-speech translation. To improve maximum-entropy-based concept generation, a forward-backward modeling approach is proposed: concept sequences in the target language are generated by selecting the hypothesis with the highest combined conditional probability under both the forward and backward generation models. Statistical language models are further applied to exploit word-level context information. The concept generation error rate is reduced by over 20% on our limited-domain speech translation corpus, and improvements are also achieved in our speech translation experiments.
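As a rough sketch of the selection rule described above (the abstract does not give the exact combination scheme; a log-linear interpolation with weights λ and μ is assumed here for illustration), the output concept sequence c* for a source-language input e would be chosen as

    c* = argmax_c [ λ log P_fwd(c | e) + (1 − λ) log P_bwd(c | e) + μ log P_LM(w(c)) ],

where P_fwd and P_bwd denote the forward and backward concept generation models, P_LM is the word-level statistical language model scored on the word sequence w(c) realized from the concepts, and λ, μ are assumed interpolation weights tuned on held-out data.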
