D09-1105 could also be integrated into a phrase-based decoder. We briefly sketch that possibility
D14-1132 -best lists generated by the phrase-based decoder to train the discriminative reordering
D14-1003 * English system is a standard phrase-based decoder trained on 6.6M segments, using
D14-1060 used an in-house hierarchical phrase-based decoder to verify our proposed models
D13-1140 k-best output of a hierarchical phrase-based decoder (Chiang, 2007). Second,
D09-1023 left-to-right in the lattice, the way phrase-based decoders operate. Since these models
D09-1105 sketch that possibility here. Phrase-based decoders keep a source coverage vector
D08-1060 similar to the blocks used in a phrase-based decoder, to further improve the rules
D09-1037 the state-of-the-art: the Moses phrase-based decoder (Koehn et al., 2007) achieves
D09-1105 better fit for integration with a phrase-based decoder. Finally, we gave a novel algorithm
D10-1027 Dynamic Programming Algorithm Phrase-based decoders generate partial target-language
D10-1091 decoder and the ones found by the phrase-based decoder, causes of failure can be easily
D09-1114 SYS2) is a reimplementation of a phrase-based decoder with lexicalized reordering model
D13-1163 2012) propose a document-wide phrase-based decoder and integrate a semantic language
D13-1106 efficiently tackle this issue. Phrase-based decoders operate by maintaining a set
D13-1054 be seamlessly integrated into phrase-based decoders since partial translations grow
D14-1132 regard the 1-best output of the phrase-based decoder with the hierarchical reordering
D08-1041 . Our baseline MT decoder is a phrase-based decoder as described in (Al-Onaizan
D10-1091 the same ruleset as conventional phrase-based decoders. We also consider an alternative
D13-1140 a phrase-based or hierarchical phrase-based decoder just as a conventional n-gram
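Several of these snippets describe the same core loop: D09-1105 mentions the source coverage vector, D10-1027 the dynamic-programming generation of partial target-language hypotheses, and D13-1106 and D13-1054 the set of partial translations grown left to right. Below is a minimal Python sketch of that loop under toy assumptions: the phrase table, scores, and beam size are all hypothetical, and distortion costs and language-model context (which real decoders such as Moses track in the hypothesis state) are omitted.

# Minimal sketch of phrase-based decoding: partial hypotheses are expanded
# left to right on the target side, and a coverage bitvector records which
# source words have already been translated. Toy phrase table and scores;
# no distortion model or LM context.
from heapq import nlargest

def phrase_options(source):
    # Hypothetical phrase table: source spans mapped to (target, log score).
    table = {
        ("das", "haus"): ("the house", -0.2),
        ("das",): ("the", -0.5),
        ("haus",): ("house", -0.4),
        ("ist", "klein"): ("is small", -0.3),
        ("ist",): ("is", -0.6),
        ("klein",): ("small", -0.5),
    }
    opts = []
    for i in range(len(source)):
        for j in range(i + 1, len(source) + 1):
            key = tuple(source[i:j])
            if key in table:
                opts.append((i, j, *table[key]))
    return opts

def decode(source, beam_size=5):
    n = len(source)
    opts = phrase_options(source)
    # A hypothesis is (score, coverage bitvector, partial target string).
    beam = [(0.0, 0, "")]
    goal = (1 << n) - 1          # all source positions covered
    finished = []
    while beam:
        new_beam = []
        for score, cov, target in beam:
            if cov == goal:      # fully covered: a complete translation
                finished.append((score, target))
                continue
            for i, j, tgt, s in opts:
                span = ((1 << j) - 1) ^ ((1 << i) - 1)
                if cov & span:   # span overlaps already-translated words
                    continue
                new_beam.append((score + s, cov | span,
                                 (target + " " + tgt).strip()))
        beam = nlargest(beam_size, new_beam)  # histogram pruning
    return max(finished) if finished else None

if __name__ == "__main__":
    print(decode(["das", "haus", "ist", "klein"]))
    # e.g. (-0.5, 'the house is small')

The coverage bitvector is what makes the "leftto-right on the target, any order on the source" strategy from D09-1023 and D09-1105 workable: hypotheses with the same coverage and comparable state can be pruned against each other, which is the dynamic-programming view D10-1027 takes.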