D14-1030 some learned representation (embedding), and (3) compute output
D14-1062 relevance for the in-domain task. By embedding our latent domain phrase model
D14-1113 word sense discrimination and embedding learning, by non-parametrically
D14-1167 Zheng. Abstract: We examine the embedding approach to reason about new relational
D15-1029 dot product between each word embedding and part of the first hidden
D15-1031 We study the problem of jointly embedding a knowledge base and a text corpus
D15-1034 relationships as translations in the embedding space, have shown promising
D15-1036 evaluation methods for unsupervised embedding techniques that obtain meaningful
D15-1038 completion impute missing facts by embedding knowledge graphs in vector spaces
D15-1054 is to explore the use of word embedding techniques to generate effective
D15-1098 component-enhanced Chinese character embedding models and their bigram extensions
D15-1153 previous work on integrating word embedding features into a discrete linear
D15-1183 Chunyan. Abstract: Most existing word embedding methods can be categorized into
D15-1191 Abstract: We consider the problem of embedding knowledge graphs (KGs) into
D15-1196 outperforms the state-of-the-art word embedding methods in both representation
D15-1200 paper we introduce a multi-sense embedding model based on Chinese Restaurant
D15-1205 Abstract: Compositional embedding models build a representation
D15-1232 words, where we assume that an embedding of each word can represent its
D15-1246 there has been a surge of word embedding algorithms and research on them
D15-1252 , penalizing embeddings, re-embedding words, and dropout. We also