C86-1088 first-order model structure. A proper embedding is a function from 1J/~ to
D15-1200 relatedness, controlling for embedding dimensionality. We find that
C88-1033 reformulated as the problem of finding an embedding function f from the representational
S15-2092 hand-crafted features and message-level embedding features, and uses an SVM
S15-2155 work on using vector space word embedding models for hypernym-hyponym extraction
W15-3822 method called Surrounding based embedding feature (SBE), and two newly
W10-4104 PClauses indicating the depth of the embedding and overlapping. PClause
D15-1038 completion impute missing facts by embedding knowledge graphs in vector spaces
S15-2094 traditional features and word embedding features to perform sentiment
P10-1121 HHMM framework, a new metric, embedding difference, is also proposed
P97-1060 logic and tree automata and the embedding of MSO logic into a constraint
S14-2011 provides dense, low-dimensional embedding for each fragment which allows
P15-1025 with the variable size of word embedding vectors, we employ the framework
C80-1074 operator (Fill, FI12 and FI21), embedding operator fill and connecting
D15-1191 proposes context-dependent KG embedding, a two-stage scheme that takes
D15-1054 is to explore the use of word embedding techniques to generate effective
W15-3822 : Left-Right surrounding based embedding feature (LR_SBE) and MAX surrounding
J10-3010 extrinsic evaluation is done by embedding the expansion systems into a
D14-1167 We examine the embedding approach to reason new relational
D14-1113 word sense discrimination and embedding learning, by non-parametrically