J13-1008 complex syntactic patterns , and embedding /embedding/NN useful morphological features
D15-1031 We study the problem of jointly embedding /embed/VBG a knowledge base and a text corpus
C82-2022 of PP 's , for which both the " embedding /embedding/NN " and the " same-level " hypotheses
D15-1191 Abstract We consider the problem of embedding /embedding/NN knowledge graphs ( KGs ) into
W05-0627 the largest probability among embedding /embed/VBG ones are kept . After predicting
P14-1011 learns how to transform semantic embedding /embedding/NN space in one language to the
C86-1088 who owns a book reads it . The embedding /embedding/NN rule for = > - conditions
J10-3010 extrinsic evaluation is done by embedding /embed/VBG the expansion systems into a
P15-1107 and words , then decodes this embedding /embedding/NN to reconstruct the original paragraph
H91-1024 there is , for example , much more embedding /embedding/NN of requests in hypotheticals
J80-1001 grammars have a straightforward embedding /embedding/NN , but which permit various transformations
D14-1062 relevance for the in-domain task . By embedding /embed/VBG our latent domain phrase model
J09-1002 TransType ideas , the innovative embedding /embedding/NN proposed here consists in using
C80-1009 construction capable of unlimited embedding /embedding/NN . The results of this treatment
D15-1205 Abstract Compositional embedding /embedding/NN models build a representation
D15-1036 result in different orderings of embedding /embedding/NN methods , calling into question
D15-1054 application of conventional word embedding /embedding/NN methodologies for ad click prediction
P15-1125 large-scale knowledge bases . The novel embedding /embedding/NN model associates each category
P15-1009 lie close to each other in the embedding /embedding/NN space . Two manifold learning
J82-3001 different . The intuitive notion of " embedding /embed/VBG a linguistic theory into a model