Experimental results show that our approach improves <term>domain-specific word alignment</term> in terms of both <term>precision</term> and <term>recall</term>, achieving a <term>relative error rate reduction</term> of 6.56% compared with state-of-the-art technologies. <term>Syntax-based statistical machine translation (MT)</term> aims to apply <term>statistical models</term> to <term>structured data</term>.
In this paper, we present a <term>syntax-based statistical machine translation system</term> based on a <term>probabilistic synchronous dependency insertion grammar</term>. <term>Synchronous dependency insertion grammars</term> are a version of <term>synchronous grammars</term> defined on <term>dependency trees</term>.
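To make the formalism concrete, here is a minimal sketch (in Python, with illustrative class names that are not taken from the paper) of an elementary dependency tree and a synchronous pair of such trees, the kind of object such a grammar pairs across languages together with a probability.

```python
from dataclasses import dataclass, field
from typing import List

# Illustrative only: a simplified stand-in for the elementary structures of a
# synchronous dependency insertion grammar. Field names and the example pair
# are assumptions, not the paper's actual data structures.

@dataclass
class DepNode:
    word: str                                  # lexical head of this node
    children: List["DepNode"] = field(default_factory=list)

@dataclass
class ElementaryTreePair:
    source: DepNode                            # source-language elementary dependency tree
    target: DepNode                            # target-language elementary dependency tree
    prob: float                                # probability assigned by the grammar

# A tiny synchronous pair linking "take a shower" to a hypothetical
# single-verb translation, with an illustrative probability.
pair = ElementaryTreePair(
    source=DepNode("take", [DepNode("shower", [DepNode("a")])]),
    target=DepNode("duschen"),
    prob=0.42,
)
```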
We first introduce our approach to inducing such a <term>grammar</term> from <term>parallel corpora</term>.
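The induction procedure itself is not spelled out in this abstract; the sketch below only illustrates the general idea of estimating fragment-pair probabilities by relative frequency over counts harvested from an aligned parallel corpus. The function name, input format, and toy data are all assumptions.

```python
from collections import Counter, defaultdict

def estimate_pair_probs(aligned_pairs):
    """Relative-frequency estimates for (source_fragment, target_fragment) pairs.

    `aligned_pairs` is an iterable of string pairs, e.g. harvested from
    word-aligned dependency-parsed parallel text; the harvesting step is omitted.
    """
    counts = Counter(aligned_pairs)
    totals = defaultdict(float)
    for (src, _tgt), c in counts.items():
        totals[src] += c
    return {pair: c / totals[pair[0]] for pair, c in counts.items()}

probs = estimate_pair_probs([("take shower", "duschen"),
                             ("take shower", "duschen"),
                             ("take shower", "sich duschen")])
print(probs)  # ('take shower', 'duschen') -> ~0.67, ('take shower', 'sich duschen') -> ~0.33
```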
Second, we describe the <term>graphical model</term> for the <term>machine translation task</term>, which can also be viewed as a <term>stochastic tree-to-tree transducer</term>.
We introduce a <term>polynomial time decoding algorithm</term> for the <term>model</term>.
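As a rough illustration of why such decoding can run in polynomial time, the sketch below performs a bottom-up Viterbi-style dynamic program over a source dependency tree, reusing the DepNode class from the sketch above. It assumes a flat word-to-word candidate table and independent choices per node, which is a simplification for exposition and not the paper's tree-to-tree transducer algorithm.

```python
from math import log

def decode(node, candidates):
    """Return (log score, translated tree) for the subtree rooted at `node`.

    `candidates` maps a source word to a list of (target_word, prob) pairs;
    the table and the per-node independence assumption are illustrative.
    With k candidates per node and n nodes, this runs in O(n * k).
    """
    best_word, best_score = None, float("-inf")
    for target_word, prob in candidates.get(node.word, [(node.word, 1e-6)]):
        score = log(prob)
        if score > best_score:
            best_word, best_score = target_word, score

    child_results = [decode(child, candidates) for child in node.children]
    total = best_score + sum(s for s, _ in child_results)
    return total, DepNode(best_word, [t for _, t in child_results])

# Toy usage with an invented candidate table.
table = {"take": [("nehmen", 0.6), ("duschen", 0.1)],
         "shower": [("Dusche", 0.7)],
         "a": [("eine", 0.9)]}
score, tree = decode(DepNode("take", [DepNode("shower", [DepNode("a")])]), table)
```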
We evaluate the outputs of our <term>MT system</term> using the <term>NIST</term> and <term>Bleu automatic MT evaluation software</term>.
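For readers who want to compute a comparable score today, the snippet below calculates corpus-level BLEU with the sacrebleu package; this is a modern stand-in for illustration, not the NIST/BLEU evaluation software used in the original experiments, and the sentences are made up.

```python
import sacrebleu

hypotheses = ["the cat sat on the mat"]
references = [["the cat is sitting on the mat"]]  # one reference stream, parallel to the hypotheses

bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU = {bleu.score:.2f}")
```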
The results show that our system outperforms the <term>baseline system</term> based on the <term>IBM models</term> in both <term>translation speed and quality</term>.