D15-1191 This paper proposes context-dependent KG embedding, a two-stage scheme that takes into account both types of connectivity patterns and obtains more accurate embeddings.
J80-1001 ATNs have the advantage of being a class of automata into which ordinary context-free phrase structure and "augmented" phrase structure grammars have a straightforward embedding, but which permit various transformations to be performed to produce grammars that can be more efficient than the original.
W15-3820 Such word vector representations, also known as word embeddings, have been shown to improve the performance of machine learning models in several NLP tasks.
J00-3002 Discussing such center embedding, Johnson (1998) presents the essential idea developed here, noting that processing overload of dependencies invoked in psycholinguistic literature could be rendered in terms of the maximal number of unresolved dependencies as represented by proof nets.
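The "maximal number of unresolved dependencies" measure lends itself to a small worked illustration. The sketch below is not from the paper: it assumes dependencies are given as word-index pairs and simply tracks, left to right, how many dependencies are open at once; the toy sentence and arcs are illustrative.

```python
# Hypothetical illustration of the unresolved-dependency measure:
# a dependency opened at word i stays unresolved until word j arrives.
def max_unresolved(arcs, length):
    """arcs: (i, j) word-index pairs with i < j; returns the peak
    number of dependencies open at the same time."""
    peak = 0
    for pos in range(length):
        open_now = sum(1 for i, j in arcs if i <= pos < j)
        peak = max(peak, open_now)
    return peak

# "the rat the cat the dog chased bit died": the subject-verb
# dependencies nest, so all three are open just before the verbs.
arcs = [(1, 8), (3, 7), (5, 6)]   # rat-died, cat-bit, dog-chased
print(max_unresolved(arcs, 9))    # 3
```

The doubly center-embedded example peaks at three simultaneously unresolved dependencies, which matches the intuition that such sentences overload processing.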
C00-1055 We have experimented with more elaborate functions that indicate how balanced the parse tree is and less complicated functions such as the level of embedding, number of parentheses, and so on.
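The two simple features named here are easy to make concrete. A minimal sketch follows, assuming the parse is given as a bracketed string; the exact feature definitions in C00-1055 are not reproduced.

```python
# Level of embedding = maximum bracket nesting depth of the parse;
# parenthesis count = total number of brackets. Both are toy versions
# of the "less complicated functions" mentioned above.
def embedding_level(parse: str) -> int:
    depth = max_depth = 0
    for ch in parse:
        if ch == "(":
            depth += 1
            max_depth = max(max_depth, depth)
        elif ch == ")":
            depth -= 1
    return max_depth

def paren_count(parse: str) -> int:
    return sum(1 for ch in parse if ch in "()")

tree = "(S (NP (DT the) (NN cat)) (VP (VBD sat)))"
print(embedding_level(tree))  # 3
print(paren_count(tree))      # 12
```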
W14-4002 Inspired by work on parsing (Klein and Manning, 2003), we explore a vertical Markovian labeling approach: intuitively, 0th-order labels signify the reordering of the sub-phrases inside the phrase pair (Zhang et al., 2008), 1st-order labels signify reordering aspects of the direct context (an embedding, parent phrase pair) of the phrase pair, and so on.
J00-3002 We argue that an incremental procedure of proof net construction affords an account of various processing phenomena, including garden pathing, the unacceptability of center embedding, preference for lower attachment, left-to-right quantifier scope preference, and heavy noun phrase shift.
P13-2087 We propose a method that takes as input an existing embedding and some labeled data, and produces an embedding in the same space but with better predictive performance on the supervised task.
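One generic way to realize this input/output contract is to initialize a trainable embedding table from the existing vectors and update it under a supervised loss. The sketch below is an assumption-laden stand-in, not the authors' actual method; EMB, X, and y are toy placeholders.

```python
# Generic sketch: refine a pre-trained embedding with labeled data so
# the refined vectors stay in the same space but serve the task better.
import torch
import torch.nn as nn

vocab_size, dim, n_classes = 1000, 50, 2
EMB = torch.randn(vocab_size, dim)        # stands in for an existing embedding

emb = nn.Embedding.from_pretrained(EMB.clone(), freeze=False)  # trainable copy
clf = nn.Linear(dim, n_classes)
opt = torch.optim.SGD(list(emb.parameters()) + list(clf.parameters()), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

X = torch.randint(0, vocab_size, (32, 10))   # toy token-id sequences
y = torch.randint(0, n_classes, (32,))       # toy class labels

for _ in range(100):
    opt.zero_grad()
    doc = emb(X).mean(dim=1)                 # average token vectors per example
    loss = loss_fn(clf(doc), y)
    loss.backward()
    opt.step()

refined = emb.weight.detach()                # same space, task-adapted vectors
```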
D14-1015 To employ the contextual information, we propose a simple and memory-efficient model for learning bilingual embeddings, taking both the source phrase and the context around the phrase into account.
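The input structure described here, a source phrase paired with its surrounding context, can be sketched loosely as follows. The architecture below (vector averaging plus one linear projection into a shared space) is an assumption for illustration, not the paper's actual model, and all vectors are random toys.

```python
# Loose sketch: embed (source phrase, context) jointly and compare the
# result with a target-phrase embedding in a shared space.
import numpy as np

rng = np.random.default_rng(0)
dim = 8
src_vec = {w: rng.normal(size=dim) for w in ["grande", "maison", "la", "dans"]}
tgt_vec = {w: rng.normal(size=dim) for w in ["big", "house"]}
W = rng.normal(size=(dim, 2 * dim))      # projection into the shared space

def avg(words, table):
    return np.mean([table[w] for w in words], axis=0)

def source_repr(phrase, context):
    pair = np.concatenate([avg(phrase, src_vec), avg(context, src_vec)])
    return W @ pair

def score(phrase, context, target):
    s, t = source_repr(phrase, context), avg(target, tgt_vec)
    return float(s @ t / (np.linalg.norm(s) * np.linalg.norm(t)))

print(score(["grande", "maison"], ["dans", "la"], ["big", "house"]))
```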
W15-4310 Besides word embeddings, we use part-of-speech (POS) tags, chunks, and Brown clusters induced from Wikipedia as features.
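Combining a dense embedding with such discrete features is commonly done by concatenation. A minimal sketch under that assumption follows; the actual feature templates of W15-4310 are not reproduced, and all vocabularies are toys.

```python
# Concatenate a word's embedding with one-hot POS, chunk, and Brown
# cluster features to form one feature vector per token.
import numpy as np

emb = {"Paris": np.array([0.1, 0.3, -0.2])}   # toy word embedding
pos_tags = ["NNP", "VBD", "DT"]
chunks = ["B-NP", "I-NP", "O"]
brown = ["0110", "1011"]                      # Brown cluster bit strings

def one_hot(value, vocabulary):
    vec = np.zeros(len(vocabulary))
    vec[vocabulary.index(value)] = 1.0
    return vec

def features(word, pos, chunk, cluster):
    return np.concatenate([
        emb.get(word, np.zeros(3)),           # dense embedding part
        one_hot(pos, pos_tags),               # part-of-speech
        one_hot(chunk, chunks),               # chunk tag
        one_hot(cluster, brown),              # Brown cluster id
    ])

print(features("Paris", "NNP", "B-NP", "0110"))
```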
W10-4104 The discourse structures in the other examples of this section are embedding, while this example is of the overlapping type.
C92-2095 (= S) on the right and so eliminate unnecessary center-embedding; and (3) eliminating scrambling and NP drop to isolate the separate effects of head-final (e.g., Verb-final) phrase structure in Japanese.
P84-1007 However, no derivation of an affixed string generated by G2* manifests any greater degree of center embedding; hence, the affixed strings associated with the expressions of L2 can still be assigned to them by a finite-state parser.
P84-1007 Moreover, under each interpretation, each of these sentences manifests first-degree center embedding.
C80-1009 As he points out, this is a simple and natural way of treating any construction capable of unlimited embedding.
J89-1005 The second-longest paper, "On the design of finite transducers for parsing phrase-structure languages" by Langendoen and Langsam, defines a finite transducer that recognizes a context-free fragment of English that contains left and right embedding and finite central embedding.
W11-1827 However, diverging from that system's more pragmatic nature, we more clearly distinguish the shared task concerns from a general semantic composition scheme that is based on the notion of embedding.
C82-2022 Another problem is posed by the structural ambiguity of sequences of PPs, for which both the "embedding" and the "same-level" hypotheses are presented.
C80-1074 They are classified into six groups, that is, modal (FI), nominalization (FII), embedding (FIII), connecting (FIV), elliptical (FV), and anaphoric operator (FVI).
W15-1504 The method, Instance-context embedding (ICE), leverages neural word embeddings, and the correlation statistics they capture, to compute high-quality embeddings of word contexts.
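The core idea of a context embedding can be sketched very simply: represent the context of a target occurrence by combining the vectors of the surrounding words. The uniform average below omits the correlation-based weighting the snippet alludes to, and the toy vectors are illustrative only.

```python
# Simplified context embedding: average the vectors of the words
# surrounding a target occurrence (ICE itself weights them).
import numpy as np

word_vec = {
    "strong": np.array([0.9, 0.1]),
    "coffee": np.array([0.2, 0.8]),
    "drink":  np.array([0.5, 0.5]),
}

def context_embedding(context_words):
    vecs = [word_vec[w] for w in context_words if w in word_vec]
    return np.mean(vecs, axis=0) if vecs else np.zeros(2)

# context of an occurrence of a target word, e.g. "cup" in
# "drink strong coffee from the cup"
print(context_embedding(["drink", "strong", "coffee"]))
```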