This paper examines what kind of <term>similarity</term> between <term>words</term> can be represented by what kind of <term>word vectors</term> in the <term>vector space model</term>. Through two experiments, three methods for constructing <term>word vectors</term>, i.e., <term>LSA-based, cooccurrence-based and dictionary-based methods</term>, were compared in terms of the ability to represent two kinds of <term>similarity</term>, i.e., <term>taxonomic similarity</term> and <term>associative similarity</term>. The result of the comparison was that the <term>dictionary-based word vectors</term> better reflect <term>taxonomic similarity</term>, while the <term>LSA-based and the cooccurrence-based word vectors</term> better reflect <term>associative similarity</term>.
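As a minimal sketch of one of the three compared methods, the following builds cooccurrence-based word vectors from a tiny toy corpus and compares them with cosine similarity in the vector space model. The corpus, the window size, and the helper names are illustrative assumptions, not the paper's actual experimental setup.

```python
# Sketch (assumed setup, not the paper's): each word is represented by
# counts of words appearing within WINDOW positions of it, and similarity
# is the cosine of the angle between two such count vectors.
from collections import Counter
from math import sqrt

# Hypothetical toy corpus for illustration only.
corpus = ("the cat chased the mouse . the dog chased the cat . "
          "the mouse ate the cheese .").split()

WINDOW = 2  # assumed context-window size


def cooccurrence_vector(target):
    """Count the words occurring within WINDOW tokens of `target`."""
    vec = Counter()
    for i, w in enumerate(corpus):
        if w == target:
            lo, hi = max(0, i - WINDOW), min(len(corpus), i + WINDOW + 1)
            for j in range(lo, hi):
                if j != i:
                    vec[corpus[j]] += 1
    return vec


def cosine(u, v):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(u[k] * v[k] for k in u)
    norm = lambda x: sqrt(sum(c * c for c in x.values()))
    return dot / (norm(u) * norm(v))


# "cat" and "dog" share contexts (both follow "the" and precede "chased"),
# so their cosine exceeds that of a less related pair like "cat"/"cheese".
print(cosine(cooccurrence_vector("cat"), cooccurrence_vector("dog")))
print(cosine(cooccurrence_vector("cat"), cooccurrence_vector("cheese")))
```

On this toy corpus, words that occur in similar contexts end up closer in the vector space, which is the associative-similarity signal the cooccurrence-based vectors are reported to capture well.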