C90-2071 considerations arising from the embedding phrase and context. Figure 1
C90-3012 on various types of syntactic embedding constructions in Germanic languages
C92-2072 thus, we would include multiple embedding constructions, poten- ACT,
C92-2095 eliminate unnecessary center-embedding; and (3) eliminating of scrambling
C92-3137 re-evaluation of the attitude in the embedding attitude contexts. Thus, ill
C96-1045 indirectly in terms of a homomorphic embedding into Quasi Logical Form (QLF
D08-1086 investigated, and then an LVCSR system embedding the presented analyzer is evaluated
D10-1116 thus maintaining a reasonable embedding capacity. 1 Introduction Steganography
D12-1086 based on Euclidean co-occurrence embedding combines the paradigmatic context
D14-1012 effectively incorporating the word embedding features within the framework
D14-1012 prototype approach, for utilizing the embedding features. The presented approaches
D14-1012 approaches can better utilize the word embedding features, among which the distributional
D14-1012 outperforming the dense and continuous embedding features by nearly 2 points of
D14-1015 investigate how to improve bilingual embedding which has been successfully used
D14-1015 translation (SMT). Despite bilingual embedding's success, the contextual information
D14-1015 memory-efficient model for learning bilingual embedding, taking both the source phrase
D14-1015 generated from our proposed bilingual embedding model are used as features in
D14-1030 some learned representation (embedding), and (3) compute output
D14-1030 Secondly, the neural network embedding of word i can predict the MEG
D14-1062 relevance for the in-domain task . By embedding our latent domain phrase model