D14-1030 some learned representation ( embedding /embedding/NN ) , and ( 3 ) compute output
D14-1062 relevance for the in-domain task . By embedding /embed/VBG our latent domain phrase model
D14-1113 word sense discrimination and embedding /embedding/NN learning , by non-parametrically
D14-1167 Zheng Abstract We examine the embedding /embedding/NN approach to reason new relational
D15-1029 dot product between each word embedding /embedding/NN and part of the first hidden
D15-1031 We study the problem of jointly embedding /embed/VBG a knowledge base and a text corpus
D15-1034 relationships as translations in the embedding /embedding/NN space , have shown promising
D15-1036 evaluation methods for unsupervised embedding /embedding/NN techniques that obtain meaningful
D15-1038 completion impute missing facts by embedding /embed/VBG knowledge graphs in vector spaces
D15-1054 is to explore the use of word embedding /embedding/NN techniques to generate effective
D15-1098 component-enhanced Chinese character embedding /embedding/NN models and their bigram extensions
D15-1153 previous work on integrating word embedding /embedding/NN features into a discrete linear
D15-1183 Chunyan Abstract Most existing word embedding /embedding/NN methods can be categorized into
D15-1191 Abstract We consider the problem of embedding /embedding/NN knowledge graphs ( KGs ) into
D15-1196 outperforms the state-of-the-art word embedding /embedding/NN methods in both representation
D15-1200 paper we introduce a multisense embedding /embedding/NN model based on Chinese Restaurant
D15-1205 R Abstract Compositional embedding /embedding/NN models build a representation
D15-1232 words , where we assume that an embedding /embedding/NN of each word can represent its
D15-1246 there has been a surge of word embedding /embedding/NN algorithms and research on them
D15-1252 , penalizing embeddings , re - embedding /embedding/NN words , and dropout . We also