D14-1113 word sense discrimination and embedding /embedding/NN learning , by non-parametrically
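
The D14-1113 snippet refers to non-parametric multi-sense embedding learning: a word receives a new sense vector whenever no existing sense vector fits the current context. A minimal sketch of that routing idea follows, assuming cosine similarity and a fixed new-sense threshold; the threshold, update rule, and toy data are illustrative assumptions, not the paper's exact procedure.

import numpy as np

rng = np.random.default_rng(5)
DIM = 8
NEW_SENSE_THRESHOLD = 0.3   # assumed cutoff for spawning a new sense

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

senses = {"bank": [rng.normal(size=DIM)]}   # start with one sense vector

def observe(word, context_vec, lr=0.1):
    # Route the context to the nearest existing sense; if none is similar
    # enough, grow the sense inventory non-parametrically.
    sims = [cosine(s, context_vec) for s in senses[word]]
    best = int(np.argmax(sims))
    if sims[best] < NEW_SENSE_THRESHOLD:
        senses[word].append(context_vec.copy())
    else:
        senses[word][best] += lr * (context_vec - senses[word][best])

for _ in range(5):
    observe("bank", rng.normal(size=DIM))
print("discriminated senses for 'bank':", len(senses["bank"]))
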
D14-1167 We examine the embedding /embedding/NN approach to reason new relational
D14-1167 propose a novel method of jointly embedding /embed/VBG entities and words into the same
D14-1167 continuous vector space . The embedding /embedding/NN process attempts to preserve
D14-1167 Times corpus show that jointly embedding /embed/VBG brings promising improvement
D14-1167 facts , compared to separately embedding /embed/VBG knowledge graphs and text . Particularly
D14-1167 text . Particularly , jointly embedding /embed/VBG enables the prediction of facts
D14-1167 can not be handled by previous embedding /embedding/NN methods . At the same time ,
D14-1167 reasoning task show that jointly embedding /embed/VBG is comparable to or slightly
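
The D14-1167 snippets describe one vector space shared by knowledge-graph entities and text words, trained so that graph facts and textual co-occurrences both shape the same vectors. Below is a minimal sketch of that shared-space idea, assuming a TransE-style translation loss for triples and a skip-gram-style positive-pair loss for text; the toy vocabulary, learning rates, and exact losses are assumptions for illustration, not the paper's model.

import numpy as np

rng = np.random.default_rng(0)
DIM = 8

# One shared space: entities, relations, and words all live in the
# same dictionary of vectors (names are toy examples).
vec = {s: rng.normal(scale=0.1, size=DIM)
       for s in ["paris", "france", "capital_of", "city", "visit"]}

def transe_step(h, r, t, lr=0.1):
    # Translation loss ||h + r - t||^2: move h and r toward t.
    diff = vec[h] + vec[r] - vec[t]
    vec[h] -= lr * diff
    vec[r] -= lr * diff
    vec[t] += lr * diff

def skipgram_step(w, c, lr=0.1):
    # Positive co-occurrence pair: descend -log sigmoid(w . c).
    g = -1.0 / (1.0 + np.exp(vec[w] @ vec[c]))
    vec[w], vec[c] = vec[w] - lr * g * vec[c], vec[c] - lr * g * vec[w]

# Alternating updates let graph facts and text co-occurrences shape
# the same vectors, which is the essence of joint embedding.
for _ in range(200):
    transe_step("paris", "capital_of", "france")
    skipgram_step("paris", "city")
    skipgram_step("visit", "paris")

print("paris . city =", round(float(vec["paris"] @ vec["city"]), 3))
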
D15-1029 dot product between each word embedding /embedding/NN and part of the first hidden
D15-1029 network architecture where the embedding /embedding/NN layer feeds into multiple hidden
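
The D15-1029 fragments describe an architecture whose embedding layer feeds the first hidden layer; in such a setup each hidden unit's pre-activation is a dot product between the word embedding and one slice of the first hidden layer's weights. A minimal sketch under that reading (the sizes and nonlinearity are illustrative assumptions):

import numpy as np

rng = np.random.default_rng(4)
VOCAB, DIM, HIDDEN = 10, 8, 4

E = rng.normal(size=(VOCAB, DIM))     # embedding layer (lookup table)
W1 = rng.normal(size=(DIM, HIDDEN))   # first hidden layer weights

e = E[3]                              # embedding of word id 3
# Hidden unit j computes dot(e, W1[:, j]): a dot product between the
# word embedding and part of the first hidden layer.
h = np.tanh(e @ W1)
print("hidden activations:", np.round(h, 3))
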
D15-1031 We study the problem of jointly embedding /embed/VBG a knowledge base and a text corpus
D15-1031 dependency on anchors . We require the embedding /embedding/NN vector of an entity not only
D15-1031 KBs but also to be equal to the embedding /embedding/NN vector computed from the text
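
The D15-1031 snippets state the key constraint: an entity's KB-side embedding must also match the embedding computed from text. One minimal reading, assuming the text-side vector is the mean of the entity's description word vectors; the composition function and all names here are illustrative assumptions, not the paper's formulation.

import numpy as np

rng = np.random.default_rng(1)
DIM = 8

word_vec = {w: rng.normal(scale=0.1, size=DIM)
            for w in ["largest", "city", "of", "france"]}
entity_vec = {"paris": rng.normal(scale=0.1, size=DIM)}
description = {"paris": ["largest", "city", "of", "france"]}

def text_embedding(entity):
    # Text-side vector: mean of the description's word vectors
    # (one simple composition choice, assumed for this sketch).
    return np.mean([word_vec[w] for w in description[entity]], axis=0)

def align_step(entity, lr=0.5):
    # Alignment loss ||e - text(e)||^2: require the KB-side entity
    # vector to match the vector computed from its description.
    diff = entity_vec[entity] - text_embedding(entity)
    entity_vec[entity] -= lr * diff
    return float(diff @ diff)

for _ in range(20):
    loss = align_step("paris")
print("alignment loss after training:", round(loss, 8))
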
D15-1034 relationships as translations in the embedding /embedding/NN space , have shown promising
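
The D15-1034 line refers to translation-based knowledge-graph models (TransE and its successors): a relation r acts as a translation vector in the embedding space, so a true triple (h, r, t) should satisfy h + r ≈ t, typically scored as f(h, r, t) = -||h + r - t||, and missing facts are imputed by ranking candidate entities under this score. This is standard background for that family of models, not a claim about D15-1034's own method.
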
D15-1036 evaluation methods for unsupervised embedding /embedding/NN techniques that obtain meaningful
D15-1036 result in different orderings of embedding /embedding/NN methods , calling into question
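
The D15-1036 snippets concern how different intrinsic evaluations can rank the same embedding methods differently. A minimal sketch of one such evaluation, word similarity scored by Spearman correlation against human judgments; the four word pairs and gold scores are made-up toy data, not a real benchmark.

import numpy as np

rng = np.random.default_rng(2)
vecs = {w: rng.normal(size=16) for w in ["cat", "dog", "car", "truck"]}

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def spearman(x, y):
    # Rank correlation (no ties in this toy data).
    rank = lambda v: np.argsort(np.argsort(v))
    return float(np.corrcoef(rank(x), rank(y))[0, 1])

# Made-up "human" similarity judgments; a real evaluation would use a
# standard word-similarity test set.
gold = {("cat", "dog"): 8.5, ("car", "truck"): 8.0,
        ("cat", "car"): 1.5, ("dog", "truck"): 1.0}

model = [cosine(vecs[a], vecs[b]) for (a, b) in gold]
print("Spearman:", round(spearman(model, list(gold.values())), 3))

Swapping the dataset, the similarity measure, or the query inventory can reorder methods, which is the instability these snippets point to.
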
D15-1038 completion impute missing facts by embedding /embed/VBG knowledge graphs in vector spaces
D15-1054 is to explore the use of word embedding /embedding/NN techniques to generate effective
D15-1054 application of conventional word embedding /embedding/NN methodologies for ad click prediction
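
The D15-1054 snippets concern reusing word-embedding techniques as features for ad click prediction. One common, minimal reading, averaging a query's word vectors and feeding them to a linear click model; the vocabulary, weights, and query below are hypothetical.

import numpy as np

rng = np.random.default_rng(3)
vecs = {w: rng.normal(size=8) for w in ["cheap", "flights", "hotel", "paris"]}

def query_features(query):
    # Mean of word embeddings: one simple way to turn word-embedding
    # techniques into fixed-size features for a click model.
    return np.mean([vecs[w] for w in query.split() if w in vecs], axis=0)

w = rng.normal(scale=0.01, size=8)        # toy logistic-regression weights
x = query_features("cheap flights paris")
p_click = 1.0 / (1.0 + np.exp(-(w @ x)))  # predicted click probability
print("p(click):", round(float(p_click), 3))
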