model,5-5-N03-2036,ak </term> . We show experimental results on <term> block selection criteria </term> based on <term> unigram counts </term>
model,6-3-N03-2036,ak During <term> decoding </term> , we use a <term> block unigram model </term> and a <term> word-based trigram language
other,5-2-N03-2036,ak <term> units of translation </term> are <term> blocks </term> - pairs of <term> phrases </term> . During
other,4-4-N03-2036,ak . During <term> training </term> , the <term> blocks </term> are learned from <term> source interval
tech,1-3-N03-2036,ak pairs of <term> phrases </term> . During <term> decoding </term> , we use a <term> block unigram model
other,21-1-N03-2036,ak </term> that uses a much simpler set of <term> model parameters </term> than similar <term> phrase-based models
other,13-5-N03-2036,ak based on <term> unigram counts </term> and <term> phrase length </term> . In this paper , we propose a novel
model,25-1-N03-2036,ak model parameters </term> than similar <term> phrase-based models </term> . The <term> units of translation </term>
model,7-1-N03-2036,ak </term> . In this paper , we describe a <term> phrase-based unigram model </term> for <term> statistical machine translation
other,9-2-N03-2036,ak </term> are <term> blocks </term> - pairs of <term> phrases </term> . During <term> decoding </term> , we
other,8-4-N03-2036,ak <term> blocks </term> are learned from <term> source interval projections </term> using an underlying <term> word alignment
tech,11-1-N03-2036,ak phrase-based unigram model </term> for <term> statistical machine translation </term> that uses a much simpler set of <term>
tech,1-4-N03-2036,ak trigram language model </term> . During <term> training </term> , the <term> blocks </term> are learned
other,10-5-N03-2036,ak selection criteria </term> based on <term> unigram counts </term> and <term> phrase length </term> . In
other,1-2-N03-2036,ak <term> phrase-based models </term> . The <term> units of translation </term> are <term> blocks </term> - pairs of <term>
model,14-4-N03-2036,ak projections </term> using an underlying <term> word alignment </term> . We show experimental results on
model,11-3-N03-2036,ak <term> block unigram model </term> and a <term> word-based trigram language model </term> . During <term> training </term> , the
hide detail