N03-2036
In this paper, we describe a
<term>
phrase-based unigram model
</term>
for
<term>
statistical machine translation
</term>
that uses a much simpler set of
<term>
model parameters
</term>
than similar
<term>
phrase-based models
</term>
. The
<term>
units of translation
</term>
are
<term>
blocks
</term>
- pairs of
<term>
phrases
</term>
. During
<term>
decoding
</term>
, we use a
<term>
block unigram model
</term>
and a
<term>
word-based trigram language model
</term>
. During
<term>
training
</term>
, the
<term>
blocks
</term>
are learned from
<term>
source interval projections
</term>
using an underlying
<term>
word alignment
</term>
. We show experimental results on
<term>
block selection criteria
</term>
based on
<term>
unigram counts
</term>
and
<term>
phrase length
</term>
.
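The step "blocks are learned from source interval projections using an underlying word alignment" can be sketched in code. This is a minimal illustrative sketch of the standard consistency-based phrase-pair extraction used in phrase-based SMT, not the paper's actual implementation; all function and variable names here are hypothetical.

```python
def project_blocks(src, tgt, alignment):
    """Enumerate consistent blocks (source phrase, target phrase).

    src, tgt: token lists; alignment: set of (i, j) pairs linking
    src[i] to tgt[j]. A block is kept only if no word inside its
    projected target span aligns to a source word outside the
    source interval (the usual consistency check).
    """
    blocks = []
    for i1 in range(len(src)):
        for i2 in range(i1, len(src)):
            # project the source interval [i1, i2] onto the target side
            js = [j for (i, j) in alignment if i1 <= i <= i2]
            if not js:
                continue
            j1, j2 = min(js), max(js)
            # consistency: the target span must not link outside [i1, i2]
            if all(i1 <= i <= i2 for (i, j) in alignment if j1 <= j <= j2):
                blocks.append((" ".join(src[i1:i2 + 1]),
                               " ".join(tgt[j1:j2 + 1])))
    return blocks

# Toy example with a monotone one-to-one alignment.
print(project_blocks(["das", "Haus"], ["the", "house"], {(0, 0), (1, 1)}))
```

On this toy input the sketch yields the single-word blocks ("das", "the") and ("Haus", "house") plus the two-word block ("das Haus", "the house"); unigram counts over such extracted blocks are what the block selection criteria above would then be computed from.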