J98-4006 especially those involving central embedding /embedding/NN ( recursion ) , was one of the
P14-1146 learning sentiment-specific word embedding /embedding/NN ( SSWE ) , which encodes sentiment
S14-2033 concatenating the sentiment-specific word embedding /embedding/NN ( SSWE ) features with the state-of-the-art
D14-1030 some learned representation ( embedding /embedding/NN ) , and ( 3 ) compute output
D15-1205 models build a representation ( or embedding /embedding/NN ) for a linguistic structure
D15-1031 We study the problem of jointly embedding /embed/VBG a knowledge base and a text corpus
J82-3001 different . The intuitive notion of " embedding /embed/VBG a linguistic theory into a model
T78-1035 recognized the desirability of embedding /embed/VBG a theory of grammar within a
P15-2098 ranking . We show that radical embedding /embedding/NN achieves comparable , and
D15-1306 and incorporating frame-semantic embedding /embedding/NN achieves the best overall performance
E09-3009 Vector Space Model ( VSM ) by embedding /embed/VBG additional types of information
J14-2006 hidden in a cover text using the embedding /embedding/NN algorithm , resulting in the
P98-2242 given , which are realised in an embedding /embedding/NN algorithm . The significant aspect
D15-1246 there has been a surge of word embedding /embedding/NN algorithms and research on them
P98-1016 of the clustering is a large CG embedding /embed/VBG all individual graphs . In the
W15-1303 relies on the notion of semantic embedding /embedding/NN and a fine-grained classification
J89-1005 English that contains left and right embedding /embedding/NN and finite central embedding
N06-4008 makes mistakes ) . Ndaona includes embedding /embedding/NN and graphics parameter estimation
N12-1049 rules , the possibility of phrasal embedding /embedding/NN and modification in time expressions
W10-4104 PClauses indicating the depth of the embedding /embedding/NN and overlapping . PClause