Publication
INTERSPEECH - Eurospeech 2001
Conference paper
Improvement of a structured language model: Arbori-context tree
Abstract
In this paper we present an extension of a context tree for a structured language model (SLM), which we call an arbori-context tree. The state-of-the-art SLM predicts the next word from a fixed partial tree of the history tree, such as two exposed heads. An arbori-context tree allows us to select an optimum partial tree of the history tree for the next-word prediction depending on its effectiveness, in a similar way that a context tree selects the length of the history (the n of an n-gram). The experiment we conducted showed that the test set perplexity of the SLM based on an arbori-context tree (79.98) was lower than that of the SLM with a fixed history (101.56).
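To make the selection idea concrete, the sketch below illustrates, under stated assumptions, how one might choose among candidate partial-tree contexts of a history (e.g., last word only, one exposed head, two exposed heads) the one that is most effective for predicting the next word. The class name, the additive-smoothing scheme, and the "highest smoothed probability" criterion are hypothetical simplifications for illustration, not the estimation procedure of the paper.

```python
from collections import defaultdict

class ArboriContextSketch:
    """Toy illustration (not the paper's algorithm): keep counts for
    several candidate partial-tree contexts of the history and pick the
    one whose smoothed distribution best predicts the next word."""

    def __init__(self, candidate_extractors):
        # candidate_extractors: functions mapping a history tree to a
        # hashable partial-tree context (e.g., tuple of exposed heads).
        self.extractors = candidate_extractors
        self.counts = [defaultdict(lambda: defaultdict(int))
                       for _ in candidate_extractors]

    def observe(self, history_tree, next_word):
        # Update counts for every candidate context of this history.
        for i, extract in enumerate(self.extractors):
            ctx = extract(history_tree)
            self.counts[i][ctx][next_word] += 1

    def prob(self, i, ctx, word, vocab_size, alpha=0.5):
        # Additive smoothing so unseen events keep nonzero probability.
        c = self.counts[i][ctx]
        return (c[word] + alpha) / (sum(c.values()) + alpha * vocab_size)

    def predict(self, history_tree, word, vocab_size):
        # Choose the partial-tree context that assigns the highest
        # smoothed probability to the next word: a crude stand-in for
        # the effectiveness-based selection described in the abstract.
        best = 0.0
        for i, extract in enumerate(self.extractors):
            ctx = extract(history_tree)
            best = max(best, self.prob(i, ctx, word, vocab_size))
        return best
```

In this toy version the "context tree" over partial trees is implicit in the fixed list of extractors; the paper's arbori-context tree instead grows and prunes the set of conditioning partial trees from data, analogous to how a context tree selects the history length of an n-gram model.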