S15-2092 hand-crafted features and message-level embedding features, and uses an SVM
J03-3003 we employed and various ways of embedding translation into a retrieval
C92-2072 thus, we would include multiple embedding constructions, poten- ACT,
P00-1011 select from a set of noun phrases, embedding proper names of different semantic
W03-0603 visually-grounded semantics and their embedding in a compositional parsing frame
W15-3814 literature. Recent advances in word embedding make computation of word distribution
D12-1086 based on Euclidean co-occurrence embedding combines the paradigmatic context
S15-2085 , word prior polarities, and embedding clusters. Using weighted Support
D15-1034 relationships as translations in the embedding space , have shown promising
P84-1007 manifests first-degree center embedding of the category S*, as a result
T87-1012 they all assiduously avoid center embedding in favor of strongly left- or
D15-1252 , penalizing embeddings, re-embedding words, and dropout. We also
S14-2033 concatenating the sentiment-specific word embedding (SSWE) features with the state-of-the-art
C80-1009 construction capable of unlimited embedding. The results of this treatment
P13-1078 translation model. In addition, word embedding is employed as the input to the
J14-2006 and Bob must find some way for embedding hidden information into their
J88-2001 event-related information from text and embedding those methods in question-answering
D14-1012 effectively incorporating the word embedding features within the framework
D15-1183 Chunyan Abstract Most existing word embedding methods can be categorized into
W00-0507 Abstract This paper describes the embedding of a statistical translation