C92-2095 eliminate unnecessary center - embedding ; and ( 3 ) eliminating of scrambling
D15-1252 , penalizing embeddings , re - embedding words , and dropout . We also
C80-1074 FI ) , nominalization ( FII ) , embedding ( FIII ) , connecting ( FIV )
C86-1008 axiomatic theory of dialogue , embedding rhetorical patterns , focusing
P10-1121 HHMM framework , a new metric , embedding difference , is also proposed
P00-1011 select from a set of noun phrases , embedding proper names of different semantic
W14-5201 pipelines with other researchers , embedding NLP pipelines in applications
J82-3001 different . The intuitive notion of " embedding a linguistic theory into a model
C82-2022 of PP 's , for which both the " embedding " and the " same-level " hypotheses
D14-1030 some learned representation ( embedding ) , and ( 3 ) compute output
W15-1511 1992 ) or the ( less well-known ) embedding form given by the canonical correlation
W13-3207 introduce a new 50-dimensional embedding obtained by spectral clustering
W05-0627 the largest probability among embedding ones are kept . After predicting
J95-2003 constituent discourse segments ; an embedding relationship may hold between
E89-1033 has been implemented , and an embedding of this in an interactive parsing
P15-1107 that hierarchically builds an embedding for a paragraph from embeddings
P15-1104 the supervised data to find an embedding subspace that fits the task complexity
C88-1033 reformulated as the problem of finding an embedding function f from the representational
P98-2242 given , which are realised in an embedding algorithm . The significant aspect
P15-2048 labels . Specifically , we learn an embedding for each label and each feature