P06-1126 pruning. Section 3 proposes our discriminative pruning method for Chinese word segmentation
P06-1126 , denoted by "KLD". For the discriminative pruning criterion, the growing algorithm
P06-1126 . In this paper, we propose a discriminative pruning method of n-gram language model
P06-1126 Experimental results showed that the discriminative pruning method achieves significant improvements
P06-1126 Experimental results show that the discriminative pruning method leads to a much smaller
P06-1126 Abstract This paper presents a discriminative pruning method of n-gram language model
P06-1126 simultaneously, according to the same discriminative pruning criterion. And we will try
P06-1126 Figure 2 and Figure 4 are used for discriminative pruning method. Growing algorithms are
P06-1126 In addition, by combining the discriminative pruning method with the baseline method
P06-1126 currently the unigram model. For the discriminative pruning method suggested in this paper
P10-1033 corpus for our research. <title> Discriminative Pruning for Discriminative ITG Alignment
P10-1033 the problem of ITG pruning. A discriminative pruning model and two discriminative
P10-1033 experiments. 7.2 Experiment Data Both discriminative pruning and alignment need training data
P06-1126 achieved the smallest models. 3 Discriminative Pruning for Chinese Word Segmentation
P06-1126 pruning, A is an invariable. The discriminative pruning criterion is inspired by the
P10-1033 alignment for sentence pairs. Discriminative pruning, however, handles not only
P06-1126 bigram is computed in terms of discriminative pruning criterion that is related to
P10-1033 The DPDI Framework DPDI, the discriminative pruning model proposed in this paper
P06-1126 project (4410001). <title> Discriminative Pruning of Language Models for Chinese
P06-1126 Conclusions and Future Work A discriminative pruning criterion of n-gram language