A hierarchical language model based on variable-length class sequences: The MCnv approach

01 March 2002

In this paper, we propose a new language model which represents long-term dependencies between word sequences using a multilevel hierarchy. We call this model MCnv, where n is the maximum number of words in a sequence and v is the maximum number of levels. The originality of this model, which is an extension of the multigrams, is its ability to capture long-distance dependencies between variable-length sequences. In order to discover the variable-length sequences and to build the hierarchy, we use a set of 233 syntactic classes extracted from eight elementary grammatical classes of French. The MCnv model learns hierarchical word patterns and uses them to re-evaluate and filter the n-best utterance hypotheses output by our speech recognizer MAUD. The model has been trained on a corpus of 43 million words extracted from the French newspaper "Le Monde" and uses a vocabulary of 20,000 words. Tests have been conducted on 300 sentences. Compared to the class trigram and the baseline multigram approach, we report a perplexity reduction of 17% and 20%, respectively. Rescoring the original n-best hypotheses yielded a word error rate improvement of 7% and 2% relative to the class trigram and multigrams, respectively.
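The abstract only sketches the method, so the following Python sketch is a rough, hypothetical illustration (not the authors' implementation) of its two ingredients: greedily grouping frequent variable-length class sequences into higher-level units, level by level, and using the resulting segmentation to rescore an n-best list. All class labels, probabilities, and acoustic scores below are invented for the demo; the real MCnv model estimates sequence statistics from 43 million words and 233 syntactic classes.

```python
from collections import Counter
from math import log

def learn_units(tokens, max_len=3, min_count=2):
    """Collect variable-length sequences (length 2..max_len) that occur
    often enough to become single units at the next level."""
    counts = Counter(
        tuple(tokens[i:i + k])
        for k in range(2, max_len + 1)
        for i in range(len(tokens) - k + 1)
    )
    return {seq for seq, c in counts.items() if c >= min_count}

def segment(tokens, units, max_len=3):
    """Greedy longest-match segmentation: replace each learned sequence
    by a single higher-level unit."""
    out, i = [], 0
    while i < len(tokens):
        for k in range(max_len, 1, -1):
            seq = tuple(tokens[i:i + k])
            if seq in units:
                out.append(seq)  # one new higher-level unit
                i += k
                break
        else:
            out.append(tokens[i])
            i += 1
    return out

def build_hierarchy(tokens, max_len=3, max_levels=2, min_count=2):
    """Apply learn/segment repeatedly, up to max_levels times (playing
    the role of the 'v' parameter of the abstract's MCnv model)."""
    levels, units_per_level = [tokens], []
    for _ in range(max_levels):
        units = learn_units(levels[-1], max_len, min_count)
        units_per_level.append(units)
        levels.append(segment(levels[-1], units, max_len))
    return levels, units_per_level

def lm_logprob(segmented, unit_probs, floor=1e-6):
    """Toy language-model score: sum of log-probabilities of the units,
    with a floor for unseen ones. Probabilities here are hypothetical."""
    return sum(log(unit_probs.get(u, floor)) for u in segmented)

if __name__ == "__main__":
    # Toy class-tagged training text: "DET NOUN" patterns recur,
    # so they are grouped into higher-level units.
    corpus = "DET NOUN VERB DET NOUN PREP DET NOUN VERB".split()
    levels, units_per_level = build_hierarchy(corpus)
    for d, lvl in enumerate(levels):
        print(f"level {d}: {lvl}")

    # N-best rescoring: combine each hypothesis's acoustic log-score
    # with the LM score of its hierarchical segmentation.
    unit_probs = {("DET", "NOUN", "VERB"): 0.3, ("DET", "NOUN"): 0.2}
    nbest = [  # (hypothesis, acoustic log-score) -- invented for the demo
        ("DET NOUN VERB".split(), -12.0),
        ("DET VERB NOUN".split(), -11.5),
    ]
    best = max(
        nbest,
        key=lambda h: h[1]
        + lm_logprob(segment(h[0], units_per_level[0]), unit_probs),
    )
    print("best hypothesis after rescoring:", " ".join(best[0]))
```

Run on the toy corpus, the first level groups the recurring "DET NOUN (VERB)" patterns into single units, and the hypothesis whose segmentation matches a learned unit wins the rescoring despite its slightly worse acoustic score; the paper's model does the analogous re-ranking over MAUD's n-best output.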