N07-1007 significantly outperformed standard n-best re-ranking. This method should be generally
N07-1007 also proposed an extension of n-best re-ranking which significantly outperformed
N07-1007 straightforward application of the n-best re-ranking approach described in Och et
P09-1066 hypothesis re-ranking, and f for n-best re-ranking only. For partial hypothesis
D14-1094 Viterbi search approach outperforms N-best re-ranking approach. The optimal value
D14-1094 reaches higher values compared to N-best re-ranking approach. Notice that, when
D08-1076 used to boost the BLEU score on N-best re-ranking tasks. The incorporation of
D13-1107 2010) proposed using MTL for N-best re-ranking on sparse feature sets, where
D14-1094 surpasses 3. This is because, unlike N-best re-ranking approach, modified Viterbi search
D14-1094 implementing Equation 1. 1. N-best re-ranking: N-best sequences of spacing
P08-2010 quickly boost the BLEU score on N-best re-ranking tasks. In this paper, we describe
W06-1626 . This limits the potential of N-best re-ranking. Spurious ambiguity is created
N07-1007 turn. 4.1 Method 1: Standard n-best re-ranking This method is a straightforward
N07-1007 experimentation. Within the space of n-best re-ranking, we have considered two variations
N07-1007 information as features in a standard n-best re-ranking scenario does not lead to an
N07-1007 substantially outperforms standard n-best re-ranking. Our best performing model achieves
P12-1032 that in graph construction for n-best re-ranking, except that two nodes are always
P08-2010 Minimum Error Rate Training for N-best Re-ranking. Katrin Kirchhoff. Abstract
D13-1013 using minimal expected F-loss in n-best re-ranking. Georgila (2009) uses integer
N13-1048 '). The hinge loss under the N-best re-ranking framework is defined as max(
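The snippets above all reference the same underlying technique: re-scoring an n-best list of candidate hypotheses with a weighted combination of features and returning the highest-scoring candidate. A minimal sketch in Python (the feature names, weights, and candidates are illustrative, not taken from any of the cited papers):

```python
# Minimal n-best re-ranking sketch: each candidate hypothesis carries a
# feature vector; the re-ranker sorts candidates by weighted feature score.
# All feature names, weights, and hypotheses below are hypothetical.

def rerank(nbest, weights):
    """Return the n-best list sorted by descending weighted feature score."""
    def score(hyp):
        return sum(weights.get(name, 0.0) * value
                   for name, value in hyp["features"].items())
    return sorted(nbest, key=score, reverse=True)

# Hypothetical 3-best list from a base decoder, with log-prob-style features.
candidates = [
    {"text": "hyp A", "features": {"lm": -2.0, "tm": -1.5}},
    {"text": "hyp B", "features": {"lm": -1.0, "tm": -3.0}},
    {"text": "hyp C", "features": {"lm": -1.5, "tm": -1.5}},
]
weights = {"lm": 1.0, "tm": 1.0}
best = rerank(candidates, weights)[0]["text"]  # "hyp C" under these weights
```

The weights themselves are typically tuned against a quality metric such as BLEU, e.g. via Minimum Error Rate Training as in P08-2010, or learned with a ranking loss such as the hinge loss mentioned in N13-1048.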