Publication
INTERSPEECH - Eurospeech 2005
Conference paper
Use of maximum entropy in natural word generation for statistical concept-based speech-to-speech translation
Abstract
Our statistical concept-based spoken language translation method consists of three cascaded components: natural language understanding, natural concept generation, and natural word generation. In previous approaches, statistical models were used only in the first two components. In this paper, a novel maximum-entropy-based statistical natural word generation algorithm is proposed that takes into account both word-level and concept-level context information in the source and target languages. A recursive generation scheme is further devised to integrate this statistical generation algorithm with the previously proposed maximum-entropy-based natural concept generation algorithm. The translation error rate is reduced by 14%-20% in our speech-to-speech translation experiments.
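
To make the log-linear formulation behind such a maximum-entropy word generation model concrete, the sketch below shows a minimal conditional model p(word | context) with feature templates that combine word-level context (the previous target word) and concept-level context (a source-side concept label). The feature templates, the toy gradient-ascent training loop, the class and function names, and the example data are illustrative assumptions for exposition only; they are not the paper's actual features, training procedure, or recursive generation scheme.

# Minimal sketch of a maximum-entropy (log-linear) word generation model:
# p(w | context) = exp(sum_i lambda_i * f_i(w, context)) / Z(context).
# Feature templates, training, and data below are illustrative assumptions,
# not the model described in the paper.

import math
from collections import defaultdict

def features(word, prev_word, concept):
    """Hypothetical binary feature templates mixing word-level and
    concept-level context, in the spirit of the abstract."""
    return [
        ("w|prev", word, prev_word),   # target word given previous target word
        ("w|concept", word, concept),  # target word given a source-side concept label
        ("w", word),                   # unigram preference for the target word
    ]

class MaxEntWordGenerator:
    def __init__(self, vocab):
        self.vocab = list(vocab)
        self.weights = defaultdict(float)  # one lambda per feature

    def score(self, word, prev_word, concept):
        return sum(self.weights[f] for f in features(word, prev_word, concept))

    def distribution(self, prev_word, concept):
        """Softmax over candidate target words for a given context."""
        scores = [self.score(w, prev_word, concept) for w in self.vocab]
        m = max(scores)
        exps = [math.exp(s - m) for s in scores]
        z = sum(exps)
        return {w: e / z for w, e in zip(self.vocab, exps)}

    def train(self, data, epochs=50, lr=0.1):
        """Toy stochastic gradient ascent on the conditional log-likelihood
        (a stand-in for GIS/IIS-style maximum-entropy training)."""
        for _ in range(epochs):
            for word, prev_word, concept in data:
                dist = self.distribution(prev_word, concept)
                for f in features(word, prev_word, concept):
                    self.weights[f] += lr              # observed feature counts
                for w, p in dist.items():
                    for f in features(w, prev_word, concept):
                        self.weights[f] -= lr * p      # expected feature counts

    def generate(self, prev_word, concept):
        dist = self.distribution(prev_word, concept)
        return max(dist, key=dist.get)

# Tiny usage example with made-up concept labels and vocabulary.
data = [
    ("flight", "a", "FLIGHT"),
    ("ticket", "a", "TICKET"),
    ("flight", "the", "FLIGHT"),
]
gen = MaxEntWordGenerator(vocab={"flight", "ticket", "hotel"})
gen.train(data)
print(gen.generate("a", "FLIGHT"))  # expected to prefer "flight"

The point of the sketch is only the conditioning: a single exponential model can weigh word-level and concept-level evidence jointly, which is the property the abstract attributes to the proposed natural word generation component.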