P00-1073 distribution-based pruning of n-gram backoff language models. Instead of
P99-1002 cost. The speech recognizer uses n-gram backoff language models estimated on
P00-1073 proposed a novel approach for pruning n-gram backoff models: keep n-grams
P04-1008 transducer Ĝ represents an n-gram backoff model for the joint probability
P10-1046 by just under 1%. The Google n-gram backoff model is almost as good as backing
P00-1073 method results in a more general n-gram backoff model, which is resistant to domain
W10-3602 occur only once are pruned, and n-gram backoff weights are re-normalized after
D08-1087 in this work can be encoded as n-gram backoff models, they are applied directly
N12-1014 Modeling a corpus with n-gram counts: n-gram backoff language models have been used
P00-1073 method results in a more general n-gram backoff model, in spite of the domain
W05-1104 StandardNgramModel class can load standard n-gram backoff models for scoring, as shown
P12-1063 n-gram models; first a classical n-gram backoff model (Chen and Goodman, 1999
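
Nearly every hit above turns on the same two mechanics: scoring a word by backing off to a shorter history scaled by a backoff weight, and, after pruning (P00-1073, W10-3602), re-normalizing those weights so each context's distribution still sums to one. Below is a minimal Python sketch of both under simple dict-based assumptions; the names (backoff_prob, prune_and_renormalize, probs, backoffs, keep) are hypothetical, not the API of any cited system such as StandardNgramModel.

    # probs maps n-gram tuples -> conditional probability p(w | context);
    # backoffs maps context tuples -> backoff weight alpha. Both hypothetical.

    def backoff_prob(ngram, probs, backoffs):
        """p(w | history): use the stored n-gram estimate if present,
        otherwise scale the next-shorter estimate by the history's
        backoff weight alpha."""
        ngram = tuple(ngram)
        if ngram in probs:
            return probs[ngram]
        if len(ngram) == 1:
            return 0.0  # unseen unigram; real models reserve <unk> mass
        alpha = backoffs.get(ngram[:-1], 1.0)
        return alpha * backoff_prob(ngram[1:], probs, backoffs)

    def prune_and_renormalize(probs, backoffs, keep):
        """Drop higher-order n-grams not in `keep` (e.g. singletons), then
        recompute each surviving context's backoff weight so probabilities
        still sum to one:
            alpha(h) = (1 - sum_kept p(w|h)) / (1 - sum_kept p(w|h'))
        where h' is h with its oldest word removed. Real toolkits apply
        this bottom-up, from the lowest order to the highest."""
        probs = {g: p for g, p in probs.items() if len(g) == 1 or g in keep}
        for ctx in sorted(backoffs, key=len):  # lowest order first
            kept = [g for g in probs if len(g) == len(ctx) + 1 and g[:-1] == ctx]
            num = 1.0 - sum(probs[g] for g in kept)
            den = 1.0 - sum(backoff_prob(g[1:], probs, backoffs) for g in kept)
            backoffs[ctx] = num / den if den > 0.0 else 1.0
        return probs, backoffs

    # Toy usage: a bigram model where one bigram is pruned away.
    probs = {("the",): 0.5, ("cat",): 0.5,
             ("the", "cat"): 0.9, ("cat", "the"): 0.4}
    backoffs = {("the",): 0.2, ("cat",): 1.2}
    probs, backoffs = prune_and_renormalize(probs, backoffs, keep={("the", "cat")})
    print(backoff_prob(("cat", "the"), probs, backoffs))  # backs off to p(the) = 0.5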