W03-2200 directions for improving MT by embedding it in an environment of other
W05-0627 the largest probability among embedding ones are kept . After predicting
W06-0603 marker presence and syntactic embedding structure to be strongly associated
W06-2205 generated from a text corpus by embedding syntactically parsed sentences
W07-1022 Understanding the structure of the embedding phrase can be an enormously beneficial
W10-2802 ularities . This latter is employed by embedding prior FrameNet-derived knowledge
W10-4104 other examples of this section are embedding , while this example is of overlapping
W11-1827 that is based on the notion of embedding . We apply our methodology to
W13-0705 what we refer to as its minimum embedding space . The focus here will be
W13-3207 introduce a new 50-dimensional embedding obtained by spectral clustering
W13-3213 or not , by classifying its RNN embedding together with those of its siblings
W14-1411 for the adjectival challenge by embedding the record types defined to deal
W14-4002 two nonterminal gaps , thereby embedding ITG permutations ( Wu , 1997
W14-5201 pipelines with other researchers , embedding NLP pipelines in applications
W15-1303 relies on the notion of semantic embedding and a fine-grained classification
W15-1501 on the popular skip-gram word embedding model . The novelty of our approach
W15-1504 We introduce a new method for embedding word instances and their context
W15-1511 1992 ) or the ( less well-known ) embedding form given by the canonical correlation
W15-2608 based approach that uses word embedding features to recognize drug names
W15-2619 rank synonym candidates with word embedding and pseudo-relevance feedback