J13-1008 |
Last , we include an appendix with further explorations of PERSON feature engineering , " binning " of Arabic number constructions according to their complex syntactic patterns , and
embedding
useful morphological features in the POS tag set .
|
D15-1031 |
We study the problem of jointly
embedding
a knowledge base and a text corpus .
|
C82-2022 |
Another problem is posed by the structural ambiguity of sequences of PP 's , for which both the "
embedding
" and the " same-level " hypotheses are presented !
|
D15-1191 |
We consider the problem of
embedding
knowledge graphs ( KGs ) into continuous vector spaces .
|
W05-0627 |
Only the constituents with the largest probability among
embedding
ones are kept .
|
P14-1011 |
After training , the model learns how to embed each phrase semantically in two languages and also learns how to transform semantic
embedding
space in one language to the other .
|
C86-1088 |
The
embedding
rule for = > - conditions requires , roughly speaking , that every proper embedding for the antecedent sub-DRS can be properly extended to the consequent sub-DRS .
|
J10-3010 |
Our extrinsic evaluation is done by
embedding
the expansion systems into a real-world search engine , and comparing the two systems based on the search results that are triggered by the respective query expansions .
|
P15-1107 |
We introduce an LSTM model that hierarchically builds an embedding for a paragraph from embeddings for sentences and words , then decodes this
embedding
to reconstruct the original paragraph .
|
H91-1024 |
When one is writing to one 's superiors , there is , for example , much more
embedding
of requests in hypotheticals .
|
J80-1001 |
ATN 's have the advantage of being a class of automata into which ordinary context-free phrase structure and " augmented " phrase structure grammars have a straightforward
embedding
, but which permit various transformations to be performed to produce grammars that can be more efficient than the original .
|
D14-1062 |
By
embedding
our latent domain phrase model in a sentence-level model and training the two in tandem , we are able to adapt all core translation components together -- phrase , lexical and reordering .
|
J09-1002 |
Following these TransType ideas , the innovative
embedding
proposed here consists in using a complete MT system to produce full target sentence hypotheses , or portions thereof , which can be accepted or amended by a human translator .
|
C80-1009 |
As he points out , this is a simple and natural way of treating any construction capable of unlimited
embedding
.
|
D15-1205 |
Compositional
embedding
models build a representation ( or embedding ) for a linguistic structure based on its component word embeddings .
|
D15-1036 |
Different evaluations result in different orderings of
embedding
methods , calling into question the common assumption that there is one single optimal vector representation .
|
D15-1054 |
We identify several potential weaknesses of the plain application of conventional word
embedding
methodologies for ad click prediction .
|
P15-1125 |
The novel
embedding
model associates each category node of the hierarchy with a distance metric .
|
P15-1009 |
The key idea of SSE is to take full advantage of additional semantic information and enforce the embedding space to be semantically smooth , i.e. , entities belonging to the same semantic category will lie close to each other in the
embedding
space .
|
J82-3001 |
The intuitive notion of "
embedding
a linguistic theory into a model of language use " as it is generally construed is much stronger than this , since it implies that the parsing system follows some ( perhaps all ) of the same operating principles as the linguistic system , and makes reference in its operation to the same system of rules .
|