C92-2095 eliminate unnecessary center - embedding /embedding/NN ; and ( 3 ) eliminating of scrambling
D15-1252 , penalizing embeddings , re - embedding /embedding/NN words , and dropout . We also
C80-1074 FI ) , nominalization ( FII ) , embedding /embedding/NN ( FIII ) , connecting ( FIV )
C86-1008 axiomatic theory of dialogue , embedding /embed/VBG rhetorical patterns , focusing
P10-1121 HHMM framework , a new metric , embedding /embedding/NN difference , is also proposed
P00-1011 select from a set of noun phrases , embedding /embed/VBG proper names of different semantic
W14-5201 pipelines with other researchers , embedding /embed/VBG NLP pipelines in applications
J82-3001 different . The intuitive notion of " embedding /embed/VBG a linguistic theory into a model
C82-2022 of PP 's , for which both the " embedding /embedding/NN " and the " same-level " hypotheses
D14-1030 some learned representation ( embedding /embedding/NN ) , and ( 3 ) compute output
W15-1511 1992 ) or the ( less well-known ) embedding /embedding/NN form given by the canonical correlation
W13-3207 introduce a new 50-dimensional embedding /embedding/NN obtained by spectral clustering
W05-0627 the largest probability among embedding /embed/VBG ones are kept . After predicting
J95-2003 constituent discourse segments ; an embedding /embedding/NN relationship may hold between
E89-1033 has been implemented , and an embedding /embedding/NN of this in an interactive parsing
P15-1107 that hierarchically builds an embedding /embedding/NN for a paragraph from embeddings
P15-1104 the supervised data to find an embedding /embedding/NN subspace that fits the task complexity
C88-1033 reformulated as the problem of finding an embedding /embedding/NN function f from the representational
P98-2242 given , which are realised in an embedding /embedding/NN algorithm . The significant aspect
P15-2048 labels . Specifically , we learn an embedding /embedding/NN for each label and each feature