P05-1057, term-annotated abstract:

We present a framework for <term> word alignment </term> based on <term> log-linear models </term> . All <term> knowledge sources </term> are treated as <term> feature functions </term> , which depend on the <term> source language sentence </term> , the <term> target language sentence </term> and possible additional <term> variables </term> . <term> Log-linear models </term> allow <term> statistical alignment models </term> to be easily extended by incorporating <term> syntactic information </term> . In this paper , we use <term> IBM Model 3 alignment properties </term> , <term> POS correspondence </term> , and <term> bilingual dictionary coverage </term> as <term> features </term> . Our experiments show that <term> log-linear models </term> significantly outperform <term> IBM translation models </term> . This paper proposes an <term> alignment
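The abstract describes combining heterogeneous knowledge sources (Model 3 alignment properties, POS correspondence, dictionary coverage) as feature functions inside a log-linear model, where each candidate alignment is scored by exponentiating a weighted sum of feature values and normalizing. A minimal sketch of that scoring scheme, with entirely hypothetical feature names, values, and weights (the paper's actual features and trained weights are not shown here):

```python
import math

def loglinear_score(features, weights):
    """Unnormalized log-linear score: exp(sum_i lambda_i * h_i(a, e, f))."""
    return math.exp(sum(weights[name] * value for name, value in features.items()))

def rank_alignments(candidates, weights):
    """Score each candidate alignment and normalize by the partition function Z.

    candidates: list of (alignment_id, feature_dict) pairs.
    Returns the best alignment id and the posterior over candidates.
    """
    scores = {aid: loglinear_score(feats, weights) for aid, feats in candidates}
    z = sum(scores.values())  # partition function over the candidate set
    posterior = {aid: s / z for aid, s in scores.items()}
    best = max(posterior, key=posterior.get)
    return best, posterior

# Hypothetical feature weights (lambdas) and feature values for two alignments.
weights = {"model3_logprob": 1.0, "pos_correspondence": 0.5, "dict_coverage": 0.3}
candidates = [
    ("a1", {"model3_logprob": -2.0, "pos_correspondence": 1.0, "dict_coverage": 1.0}),
    ("a2", {"model3_logprob": -3.5, "pos_correspondence": 0.0, "dict_coverage": 1.0}),
]
best, posterior = rank_alignments(candidates, weights)
```

In practice the sum over all alignments is intractable, so Z is approximated over an n-best candidate list, as done here; for long feature sums one would also work in log space to avoid overflow in `math.exp`.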