other,30-3-N03-1017,ak : <term> heuristic learning </term> of <term> phrase translations </term> from <term> word-based
other,39-3-N03-1017,ak and <term> lexical weighting </term> of <term> phrase translations </term> . Surprisingly , learning
other,3-4-N03-1017,ak translations </term> . Surprisingly , learning <term> phrases </term> longer than three <term> words </term>
other,10-4-N03-1017,ak three <term> words </term> and learning <term> phrases </term> from <term> high-accuracy word-level
other,2-5-N03-1017,ak Learning only <term> syntactically motivated phrases </term> degrades the performance of our systems
other,3-2-N03-2017,ak </term> . It requires disjoint <term> English phrases </term> to be mapped to non-overlapping intervals
other,9-2-N03-2036,ak </term> are <term> blocks </term> - pairs of <term> phrases </term> . During <term> decoding </term> , we
other,13-5-N03-2036,ak based on <term> unigram counts </term> and <term> phrase length </term> . In this paper , we propose
other,26-5-H05-1005,ak redundancy </term> , focusing on <term> noun phrases </term> . This paper presents a <term> maximum
other,12-1-H05-1095,ak method </term> , based on <term> non-contiguous phrases </term> , i.e. <term> phrases </term> with gaps
other,16-1-H05-1095,ak non-contiguous phrases </term> , i.e. <term> phrases </term> with gaps . A method for producing
other,5-2-H05-1095,ak gaps . A method for producing such <term> phrases </term> from a <term> word-aligned corpora </term>
other,10-3-H05-1095,ak </term> is also presented that deals such <term> phrases </term> , as well as a <term> training method
other,22-1-P05-1032,ak retrieval </term> of arbitrarily long <term> phrases </term> while simultaneously using less <term>
other,12-2-P05-1032,ak retrieval times </term> for looking up <term> phrase translations </term> in our <term> suffix array-based
other,6-3-P05-1053,ak Our study illustrates that the base <term> phrase chunking information </term> is very effective
tech,8-2-P05-1069,ak with orientation </term> to handle <term> local phrase re-ordering </term> . We use a <term> maximum
other,21-3-P05-1074,ak language can be identified using a <term> phrase </term> in another language as a pivot .
other,67-5-E06-1035,ak conversational cues </term> , such as <term> cue phrases </term> and <term> overlapping speech </term>
lr,4-1-P83-1004,ak formalisms </term> that combine <term> context-free phrase structure rules and metarules ( MPS grammars