C92-2095 ( = S ) on the right and so eliminate unnecessary center - embedding ; and ( 3 ) elimination of scrambling and NP drop to isolate the separate effects of head-final ( e.g. , Verb-final ) phrase structure in Japanese .
D15-1252 We tried several frequently applied or newly proposed regularization strategies , including penalizing weights ( embeddings excluded ) , penalizing embeddings , re - embedding words , and dropout .
C80-1074 They are classified into six groups , that is , modal ( FI ) , nominalization ( FII ) , embedding ( FIII ) , connecting ( FIV ) , elliptical ( FV ) and anaphoric operator ( FVI ) .
C86-1008 The importance of such a structural description , if attained , is that it would make possible an axiomatic theory of dialogue , embedding rhetorical patterns , focusing , and focus shifting .
P10-1121 Besides defining standard metrics in the HHMM framework , a new metric , embedding difference , is also proposed , which tests the hypothesis that HHMM store elements represent syntactic working memory .
P00-1011 This paper describes a supervised learning method to automatically select from a set of noun phrases , embedding proper names of different semantic classes , their most distinctive features .
W14-5201 Contrary to other recent endeavors that rely heavily on web services , our collection consists only of portable components distributed via a repository , making it particularly interesting with respect to sharing pipelines with other researchers , embedding NLP pipelines in applications , and the use on high-performance computing clusters .
J82-3001 The intuitive notion of " embedding a linguistic theory into a model of language use " as it is generally construed is much stronger than this , since it implies that the parsing system follows some ( perhaps all ) of the same operating principles as the linguistic system , and makes reference in its operation to the same system of rules .
C82-2022 Another problem is posed by the structural ambiguity of sequences of PP 's , for which both the " embedding " and the " same-level " hypotheses are presented .
D14-1030 Many statistical models for natural language processing exist , including context-based neural networks that ( 1 ) model the previously seen context as a latent feature vector , ( 2 ) integrate successive words into the context using some learned representation ( embedding ) , and ( 3 ) compute output probabilities for incoming words given the context .
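The three steps named in D14-1030 can be illustrated with a minimal recurrent language-model step. This is an illustrative sketch, not the paper's model: the vocabulary size, hidden size, and random parameters are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
V, H = 5, 4                      # toy vocabulary and hidden sizes (assumed)
E = rng.normal(size=(V, H))      # learned word representations (step 2)
W = rng.normal(size=(H, H))      # recurrence over the latent context (step 1)
U = rng.normal(size=(V, H))      # output projection (step 3)

def step(context, word_id):
    """(1) carry the seen context as a latent vector; (2) integrate the
    next word via its embedding; (3) softmax output probabilities."""
    context = np.tanh(context @ W + E[word_id])
    logits = U @ context
    probs = np.exp(logits - logits.max())
    return context, probs / probs.sum()

ctx = np.zeros(H)
ctx, p = step(ctx, word_id=2)
print(p.shape)  # (5,) -- a distribution over the vocabulary
```

Each call folds one more word into the fixed-size context vector, so the model never stores the raw history explicitly.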
W15-1511 Using either the bit-string form given by the algorithm of Brown et al. ( 1992 ) or the ( less well-known ) embedding form given by the canonical correlation analysis algorithm of Stratos et al. ( 2014 ) , we can obtain 93 % tagging accuracy with just 400 labeled words and achieve state-of-the-art accuracy ( > 97 % ) with less than 1 percent of the original training data .
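The bit-string form mentioned in W15-1511 is typically used by taking prefixes of each word's Brown cluster path as coarse-to-fine features. A minimal sketch, with made-up bit-strings standing in for the output of an actual Brown clustering run:

```python
# Hypothetical Brown-style cluster bit-strings for a few words
# (real strings would come from running the Brown et al. (1992) algorithm).
brown = {"the": "0010", "a": "0011", "dog": "1100", "cat": "1101"}

def prefix_features(word, lengths=(2, 4)):
    """Use bit-string prefixes of several lengths as coarse-to-fine
    cluster features for a supervised tagger."""
    bits = brown.get(word)
    if bits is None:
        return ["brown:OOV"]
    return [f"brown{n}:{bits[:n]}" for n in lengths]

print(prefix_features("dog"))  # ['brown2:11', 'brown4:1100']
```

Short prefixes group many words together (high recall), long prefixes are nearly word-specific (high precision), which is why both are used at once.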
W13-3207 We introduce a new 50-dimensional embedding obtained by spectral clustering of a graph describing the conceptual structure of the lexicon .
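The spectral-clustering route to an embedding, as in W13-3207, can be sketched with the standard normalized-Laplacian construction. The toy graph and dimensionality here are assumptions for illustration, not the paper's lexicon graph:

```python
import numpy as np

def spectral_embedding(adj, dim):
    """Embed graph nodes via the bottom eigenvectors of the normalized
    graph Laplacian (the standard spectral-clustering step)."""
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(deg, 1e-12)))
    lap = np.eye(len(adj)) - d_inv_sqrt @ adj @ d_inv_sqrt
    # eigh returns eigenvalues in ascending order; skip the trivial
    # constant eigenvector and keep the next `dim` coordinates.
    _, vecs = np.linalg.eigh(lap)
    return vecs[:, 1:dim + 1]

# toy graph: two triangles joined by a single bridge edge
A = np.zeros((6, 6))
for i, j in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]:
    A[i, j] = A[j, i] = 1.0
emb = spectral_embedding(A, 2)
print(emb.shape)  # (6, 2)
```

The first coordinate (the Fiedler vector) separates the two triangles by sign, so nodes in the same cluster land near each other in the embedded space.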
W05-0627 Only the constituents with the largest probability among embedding ones are kept .
J95-2003 At the level of linguistic structure , discourses divide into constituent discourse segments ; an embedding relationship may hold between two segments .
E89-1033 An incremental chart parser embodying the ideas put forward in this paper has been implemented , and an embedding of this in an interactive parsing system is near completion .
P15-1107 We introduce an LSTM model that hierarchically builds an embedding for a paragraph from embeddings for sentences and words , then decodes this embedding to reconstruct the original paragraph .
P15-1104 To overcome this issue , we use the supervised data to find an embedding subspace that fits the task complexity .
C88-1033 The problem of classification can be reformulated as the problem of finding an embedding function f from the representational entities onto the domain of a model .
P98-2242 Two rules for the content determination and construction of the non-referring part are given , which are realised in an embedding algorithm .
P15-2048 Specifically , we learn an embedding for each label and each feature such that labels which frequently co-occur are close in the embedded space .
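One simple way to realize the goal stated in P15-2048 (co-occurring labels close in the embedded space) is a low-rank factorization of the label co-occurrence matrix. This is a hedged sketch, not the paper's method; the counts and the diagonal trick are assumptions made for the demo:

```python
import numpy as np

# Hypothetical label co-occurrence counts: labels 0 and 1 appear
# together often, label 2 rarely joins either.
cooc = np.array([[0., 9., 1.],
                 [9., 0., 1.],
                 [1., 1., 0.]])

W = np.log1p(cooc)
# Put the row sums on the diagonal so the matrix is diagonally
# dominant (hence PSD) and admits a real low-rank factorization.
M = W + np.diag(W.sum(axis=1))

vals, vecs = np.linalg.eigh(M)           # eigenvalues in ascending order
emb = vecs[:, -2:] * np.sqrt(vals[-2:])  # top-2 spectral coordinates

d01 = np.linalg.norm(emb[0] - emb[1])    # high-co-occurrence pair
d02 = np.linalg.norm(emb[0] - emb[2])    # low-co-occurrence pair
print(d01 < d02)  # the frequently co-occurring pair is closer
```

Any factorization that approximates the (smoothed) co-occurrence matrix by inner products has this effect: high counts force high inner product, which pulls the corresponding vectors together.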