C80-1074 be the nominalization and the embedding operator respectively . An arbitrary
A00-2015 subordinate clauses , in which scope embedding preference of subordinate clauses
D12-1086 based on Euclidean co-occurrence embedding combines the paradigmatic context
C90-2071 considerations arising from the embedding phrase and context . Figure 1
C80-1074 operator ( Fill , FII 2 and FI21 ) , embedding operator FIII and connecting
C92-3137 re-evaluation of the attitude in the embedding attitude contexts . Thus , ill
C92-2095 eliminate unnecessary center-embedding ; and ( 3 ) elimination of scrambling
C80-1074 FI ) , nominalization ( FII ) , embedding ( FIII ) , connecting ( FIV )
C92-2072 thus , we would include multiple embedding constructions , poten - ACT ,
C82-2022 of PP 's , for which both the " embedding " and the " same-level " hypotheses
D14-1012 prototype approach , for utilizing the embedding features . The presented approaches
D14-1012 approaches can better utilize the word embedding features , among which the distributional
D14-1012 outperforming the dense and continuous embedding features by nearly 2 points of
D14-1015 investigate how to improve bilingual embedding which has been successfully used
D14-1015 translation ( SMT ) . Despite bilingual embedding 's success , the contextual information
D14-1015 memory-efficient model for learning bilingual embedding , taking both the source phrase
D14-1015 generated from our proposed bilingual embedding model are used as features in
D14-1030 some learned representation ( embedding ) , and ( 3 ) compute output
D14-1030 Secondly , the neural network embedding of word i can predict the MEG
D14-1062 relevance for the in-domain task . By embedding our latent domain phrase model