E06-1006 were found in combination with a word-to-word translation model. Most of these techniques
E03-1004 line, labeled 0, shows their word-to-word translations. The remaining lines correspond
C92-2069 machine translation which used word-to-word translation probability between two languages
D14-1098 language P(wi). Instead of word-to-word translation, the transduction of the context
E03-1050 Brown et al., 1993) is based on word-to-word translations. The SMT system used in this
C04-1117 order to learn English and German word-to-word translations. His approach is based on similarity
C02-1011 language. To do so, we use a word-to-word translation dictionary and corpus data in
D14-1024 translation (SMT) where the word-to-word translation of MWEs often results in wrong
J03-1008 robustness, and broad coverage. The word-to-word translation component simply uses rules to
E03-1050 translation system uses not only word-to-word translations but also phrase translations
E97-1063 ). We define the recall of a word-to-word translation model as the fraction of the
J00-2004 methods was used to estimate a word-to-word translation model from the 29,614 verse pairs
E14-2011 , the results are unidiomatic word-to-word translations but still grammatical. For instance
J03-1005 translator, who is asked to produce word-to-word translations wherever it is possible. Part
D15-1245 because it does not undergo the word-to-word translation of the other data sets. Argument
D14-1061 word alignment models contain word-to-word translation probabilities t(f|e), having
D08-1090 resulting in the generation of new word-to-word translations. More recently, Resnik and
E03-1004 on lexical substitution using word-to-word translation dictionaries enhanced by the
D13-1173 iterations on each and build a word-to-word translation lexicon Tdecipher by collecting
C92-2069 opportunity to discover the optimal word-to-word translation probability, according to the