J98-4006 especially those involving central embedding (recursion), was one of the
P14-1146 learning sentiment-specific word embedding (SSWE), which encodes sentiment
S14-2033 concatenating the sentiment-specific word embedding (SSWE) features with the state-of-the-art
D14-1030 some learned representation (embedding), and (3) compute output
D15-1205 models build a representation (or embedding) for a linguistic structure
D15-1031 We study the problem of jointly embedding a knowledge base and a text corpus
J82-3001 different. The intuitive notion of "embedding a linguistic theory into a model
T78-1035 recognized the desirability of embedding a theory of grammar within a
P15-2098 ranking. We show that radical embedding achieves comparable, and
D15-1306 and incorporating frame-semantic embedding achieves the best overall performance
E09-3009 Vector Space Model (VSM) by embedding additional types of information
J14-2006 hidden in a cover text using the embedding algorithm, resulting in the
P98-2242 given, which are realised in an embedding algorithm. The significant aspect
D15-1246 there has been a surge of word embedding algorithms and research on them
P98-1016 of the clustering is a large CG embedding all individual graphs. In the
W15-1303 relies on the notion of semantic embedding and a fine-grained classification
J89-1005 English that contains left and right embedding and finite central embedding
N06-4008 makes mistakes). Ndaona includes embedding and graphics parameter estimation
N12-1049 rules , the possibility of phrasal embedding and modification in time expressions
W10-4104 PClauses indicating the depth of the embedding and overlapping.