model,11-3-N03-2036,bq <term> block unigram model </term> and a <term> word-based trigram language model </term> . During <term> training </term> , the
model,14-4-N03-2036,bq projections </term> using an underlying <term> word alignment </term> . We show experimental results on
model,25-1-N03-2036,bq model parameters </term> than similar <term> phrase-based models </term> . The <term> units of translation </term>
model,6-3-N03-2036,bq During <term> decoding </term> , we use a <term> block unigram model </term> and a <term> word-based trigram language
model,7-1-N03-2036,bq </term> . In this paper , we describe a <term> phrase-based unigram model </term> for <term> statistical machine translation
other,1-2-N03-2036,bq <term> phrase-based models </term> . The <term> units of translation </term> are <term> blocks </term> - pairs of <term>
other,10-5-N03-2036,bq selection criteria </term> based on <term> unigram </term> counts and <term> phrase </term> length
other,13-5-N03-2036,bq based on <term> unigram </term> counts and <term> phrase </term> length . In this paper , we propose
other,21-1-N03-2036,bq </term> that uses a much simpler set of <term> model parameters </term> than similar <term> phrase-based models
other,4-4-N03-2036,bq . During <term> training </term> , the <term> blocks </term> are learned from <term> source interval
other,5-2-N03-2036,bq <term> units of translation </term> are <term> blocks </term> - pairs of <term> phrases </term> . During
other,5-5-N03-2036,bq </term> . We show experimental results on <term> block selection criteria </term> based on <term> unigram </term> counts
other,8-4-N03-2036,bq <term> blocks </term> are learned from <term> source interval projections </term> using an underlying <term> word alignment
other,9-2-N03-2036,bq </term> are <term> blocks </term> - pairs of <term> phrases </term> . During <term> decoding </term> , we
tech,1-3-N03-2036,bq pairs of <term> phrases </term> . During <term> decoding </term> , we use a <term> block unigram model
tech,1-4-N03-2036,bq trigram language model </term> . During <term> training </term> , the <term> blocks </term> are learned
tech,11-1-N03-2036,bq phrase-based unigram model </term> for <term> statistical machine translation </term> that uses a much simpler set of <term>