D15-1191 proposes context-dependent KG embedding, a two-stage scheme that takes
J80-1001 grammars have a straightforward embedding, but which permit various transformations
W15-3820 representations, also known as word embedding, have been shown to improve
J00-3002 stank. Discussing such center embedding, Johnson (1998) presents the
C00-1055 functions such as the level of embedding, number of parentheses, and
W14-4002 aspects of the direct context (an embedding, parent phrase pair) of the
J00-3002 the unacceptability of center embedding, preference for lower attach
P13-2087 that takes as input an existing embedding, some labeled data, and produces
D14-1015 memory-efficient model for learning bilingual embedding, taking both the source phrase
W15-4310 representation. Besides word embedding, we use part-of-speech (POS
W10-4104 other examples of this section are embedding, while this example is of overlapping
C92-2095 eliminate unnecessary center-embedding; and (3) eliminating scrambling
P84-1007 manifests any greater degree of center embedding; hence, the affixed strings
P84-1007 manifests first-degree center embedding. In E3, the included VP saw
C80-1009 construction capable of unlimited embedding. The results of this treatment
J89-1005 right embedding and finite central embedding. They argue that "the theory
W11-1827 that is based on the notion of embedding. We apply our methodology to
C82-2022 of PPs, for which both the "embedding" and the "same-level" hypotheses
C80-1074 FI), nominalization (FII), embedding (FIII), connecting (FIV)
W15-1504 The method, Instance-context embedding (ICE), leverages neural word