C92-2101 machine translation system performs word-for-word translation . That is , a TL sentence is
A00-1002 system uses the method of direct word-for-word translation , justified by the similarity
C94-1038 arbitrariness . It is well known that word-for-word translations are not the paradigmatic case
J12-3001 implementation of his earlier ideas on word-for-word translation . The results of translating
C02-2007 the bilingual concepts besides word-for-word translations . If we can manage it , not only
N09-1046 that also correspond to plausible word-for-word translations into English . Figure 2 shows
J99-4005 channel model offers deterministic , word-for-word translations , then the bigram source model
C69-4201 in parallel , in an equivalent word-for-word translation of the first language selected
C73-2031 decimal representations , since word-for-word translation ( and a few additional rules
C80-1080 full in the translation table ; a word-for-word translation would probably be too confusing
C00-2159 translations varies greatly from word-for-word translations to short summaries . In the
J12-3001 information . Yngve concluded that word-for-word translation could be taken as a first approximation
P00-1006 ; 2 ) choose w according to a word-for-word translation probability p ( w | s ) . Summing
H94-1028 Although we argued above that word-for-word translation will not work to develop f from
A00-1002 ČESÍLKO The greatest problem of the word-for-word translation approach ( for languages with
C67-1008 the minimum number of possible word-for-word translations into grammatical symbols . 4
C65-1027 . In model 1 , we may perform word-for-word translation by presenting an input corpus
C90-1016 of 2 ) , and assigns a default word-for-word translation in the case of 3 ) . 5 . Examples
H94-1018 view of performance , e.g. as in word-for-word translation or simple word indexing , the
C94-1070 on , WW.TRANS produces possible word-for-word translations , which are presented on screen
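Read together, these snippets all refer to one baseline technique: each source word is replaced independently by a target word, in the probabilistic variants (J99-4005, P00-1006) by choosing w according to a lexical translation probability p ( w | s ) . A minimal sketch of that baseline follows; the language pair, the lexicon entries, and every probability are invented for illustration and are not taken from any of the cited systems.

    # Word-for-word (direct) translation: each source word is translated
    # in isolation. Lexicon and probabilities below are hypothetical.
    lexicon = {
        "la":    {"the": 0.9, "it": 0.1},
        "casa":  {"house": 0.8, "home": 0.2},
        "verde": {"green": 1.0},
    }

    def word_for_word(sentence: str) -> str:
        """Replace each source word independently with its most probable
        target word; unknown words pass through unchanged."""
        out = []
        for s in sentence.lower().split():
            candidates = lexicon.get(s)
            out.append(max(candidates, key=candidates.get) if candidates else s)
        return " ".join(out)

    print(word_for_word("la casa verde"))  # -> "the house green"

The output "the house green" rather than "the green house" shows why several snippets treat word-for-word translation only as a first approximation (J12-3001) or argue it will not work in general (H94-1028): per-word substitution cannot reorder words or resolve context-dependent choices.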