This paper examines what kind of <term>similarity</term> between <term>words</term> can be represented by what kind of <term>word vectors</term> in the <term>vector space model</term>. Through two experiments, three methods for constructing <term>word vectors</term>, i.e., <term>LSA-based, cooccurrence-based and dictionary-based methods</term>, were compared in terms of the ability to represent two kinds of <term>similarity</term>, i.e., <term>taxonomic similarity</term> and <term>associative similarity</term>. The result of the comparison was that the dictionary-based <term>word vectors</term> better reflect <term>taxonomic similarity</term>, while the <term>LSA-based and the cooccurrence-based word vectors</term> better reflect <term>associative similarity</term>.
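As a minimal sketch of one of the compared approaches (not the paper's implementation), the following builds cooccurrence-based word vectors from a toy corpus and compares words by cosine similarity in the vector space model; the corpus, window size, and word choices are illustrative assumptions.

```python
# Sketch: cooccurrence-based word vectors + cosine similarity.
# Corpus and WINDOW are illustrative, not from the paper.
from collections import defaultdict
from math import sqrt

corpus = [
    "the cat chased the mouse".split(),
    "the dog chased the cat".split(),
    "the mouse ate the cheese".split(),
]

WINDOW = 2  # context words counted on each side (assumed parameter)

# Count how often each context word appears near each target word.
cooc = defaultdict(lambda: defaultdict(int))
for sentence in corpus:
    for i, word in enumerate(sentence):
        lo, hi = max(0, i - WINDOW), min(len(sentence), i + WINDOW + 1)
        for j in range(lo, hi):
            if j != i:
                cooc[word][sentence[j]] += 1

def cosine(u, v):
    """Cosine similarity between two sparse vectors stored as dicts."""
    dot = sum(u[k] * v.get(k, 0) for k in u)
    nu = sqrt(sum(x * x for x in u.values()))
    nv = sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

# "cat" and "mouse" share contexts ("the", "chased"), so their vectors
# come out more similar than those of "cat" and "cheese".
sim_cat_mouse = cosine(cooc["cat"], cooc["mouse"])
sim_cat_cheese = cosine(cooc["cat"], cooc["cheese"])
```

Words that occur in similar contexts end up with similar cooccurrence vectors, which is why vectors of this kind tend to capture associative rather than strictly taxonomic relatedness.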