W13-0705 the computation of the minimum embedding /embedding/NN space , the event locus , for
W13-3207 introduce a new 50-dimensional embedding /embedding/NN obtained by spectral clustering
W13-3207 structure of the lexicon . We use the embedding /embedding/NN directly to investigate sets
W13-3213 or not , by classifying its RNN embedding /embedding/NN together with those of its siblings
W14-1411 for the adjectival challenge by embedding /embed/VBG the record types defined to deal
W14-4002 two nonterminal gaps , thereby embedding /embed/VBG ITG permutations ( Wu , 1997
W14-4002 aspects of the direct context ( an embedding /embedding/NN , parent phrase pair ) of the
W14-5201 pipelines with other researchers , embedding /embed/VBG NLP pipelines in applications
W15-1303 relies on the notion of semantic embedding /embedding/NN and a fine-grained classification
W15-1501 on the popular skip-gram word embedding /embedding/NN model . The novelty of our approach
W15-1504 We introduce a new method for embedding /embed/VBG word instances and their context
W15-1504 The method , Instance-context embedding /embedding/NN ( ICE ) , leverages neural word
W15-1511 1992 ) or the ( less well-known ) embedding /embedding/NN form given by the canonical correlation
W15-2608 based approach that uses word embedding /embedding/NN features to recognize drug names
W15-2619 rank synonym candidates with word embedding /embedding/NN and pseudo-relevance feedback
W15-2619 PRF-based reranking outperformed word embedding /embedding/NN based approach and a strong baseline
W15-3041 and a feature produced with word embedding /embedding/NN models ( SHEF - QuEst + + ) .
W15-3124 sentimental features , word embedding /embedding/NN is employed for acquiring expanded
W15-3814 literature . Recent advances in word embedding /embedding/NN make computation of word distribution
W15-3814 extraction by using the latest word embedding /embedding/NN methods . By using bag-of-words