C92-2095 |
( = S ) on the right and so eliminate unnecessary center -
embedding
; and ( 3 ) eliminating scrambling and NP drop to isolate the separate effects of head-final ( e.g. , verb-final ) phrase structure in Japanese .
|
D15-1252 |
We tried several frequently applied or newly proposed regularization strategies , including penalizing weights ( embeddings excluded ) , penalizing embeddings , re -
embedding
words , and dropout .
|
C80-1074 |
Then , the modal operators ( FI11 , FI12 and FI21 ) ,
embedding
operator FIII and connecting operator FIV are extracted by investigating the variety and the inflectional form of the predicate or the words which follow the predicate .
|
C80-1074 |
They are classified into six groups , that is , modal ( FI ) , nominalization ( FII ) ,
embedding
( FIII ) , connecting ( FIV ) , elliptical ( FV ) and anaphoric operator ( FVI ) .
|
C86-1008 |
The importance of such a structural description , if attained , is that it would make possible an axiomatic theory of dialogue ,
embedding
rhetorical patterns , focusing , and focus shifting .
|
P10-1121 |
Besides defining standard metrics in the HHMM framework , a new metric ,
embedding
difference , is also proposed , which tests the hypothesis that HHMM store elements represent syntactic working memory .
|
P00-1011 |
This paper describes a supervised learning method to automatically select from a set of noun phrases ,
embedding
proper names of different semantic classes , their most distinctive features .
|
W14-5201 |
Contrary to other recent endeavors that rely heavily on web services , our collection consists only of portable components distributed via a repository , making it particularly interesting with respect to sharing pipelines with other researchers ,
embedding
NLP pipelines in applications , and the use on high-performance computing clusters .
|
H90-1008 |
tree transformation : ( futr Φ ) τ = T ( futr ( Φ τ ) ) . This is quite analogous to pres and past , except that the temporal location of the new episode e is specified relative to the episode η ( usually a present or past episode ) characterized by " having Φ true in its future , " rather than relative to an "
embedding
episode . "
|
J82-3001 |
computational unruliness , in the sense that it is claimed that there is a " natural "
embedding
of an LFG into a parsing mechanism ( a performance model ) that accounts for human sentence processing behavior .
|
J82-3001 |
The intuitive notion of "
embedding
a linguistic theory into a model of language use " as it is generally construed is much stronger than this , since it implies that the parsing system follows some ( perhaps all ) of the same operating principles as the linguistic system , and makes reference in its operation to the same system of rules .
|
C82-2022 |
Another problem is posed by the structural ambiguity of sequences of PP 's , for which both the "
embedding
" and the " same-level " hypotheses are presented .
|
D14-1030 |
Many statistical models for natural language processing exist , including context-based neural networks that ( 1 ) model the previously seen context as a latent feature vector , ( 2 ) integrate successive words into the context using some learned representation (
embedding
) , and ( 3 ) compute output probabilities for incoming words given the context .
|
W15-1511 |
Using either the bit-string form given by the algorithm of Brown et al. ( 1992 ) or the ( less well-known )
embedding
form given by the canonical correlation analysis algorithm of Stratos et al. ( 2014 ) , we can obtain 93 % tagging accuracy with just 400 labeled words and achieve state-of-the-art accuracy ( > 97 % ) with less than 1 percent of the original training data .
|
W13-3207 |
We introduce a new 50-dimensional
embedding
obtained by spectral clustering of a graph describing the conceptual structure of the lexicon .
|
D15-1246 |
We show that all
embedding
approaches behave similarly in this task , with dependency-based embeddings performing best .
|
W05-0627 |
Only the constituents with the largest probability among
embedding
ones are kept .
|
J95-2003 |
At the level of linguistic structure , discourses divide into constituent discourse segments ; an
embedding
relationship may hold between two segments .
|
W14-4002 |
Inspired by work on parsing ( Klein and Manning , 2003 ) , we explore a vertical Markovian labeling approach : intuitively , 0th-order labels signify the reordering of the sub-phrases inside the phrase pair ( Zhang et al. , 2008 ) , 1st-order labels signify reordering aspects of the direct context ( an
embedding
, parent phrase pair ) of the phrase pair , and so on .
|
E89-1033 |
An incremental chart parser embodying the ideas put forward in this paper has been implemented , and an
embedding
of this in an interactive parsing system is near completion .
|