1-P06-2110,ak: This paper examines what kind of <term> similarity </term> between <term> words </term> can be represented by what kind of <term> word vectors </term> in the <term> vector space model </term> .
    term annotations: other,6-1-P06-2110,ak ; other,8-1-P06-2110,ak ; model,16-1-P06-2110,ak ; tech,20-1-P06-2110,ak
2-P06-2110,ak: Through two experiments , three methods for constructing <term> word vectors </term> , i.e. , <term> LSA-based , cooccurrence-based and dictionary-based methods </term> , were compared in terms of the ability to represent two kinds of <term> similarity </term> , i.e. , <term> taxonomic similarity </term> and <term> associative similarity </term> .
    term annotations: model,8-2-P06-2110,ak ; tech,13-2-P06-2110,ak ; other,32-2-P06-2110,ak ; other,36-2-P06-2110,ak ; other,39-2-P06-2110,ak
3-P06-2110,ak: The result of the comparison was that the <term> dictionary-based word vectors </term> better reflect <term> taxonomic similarity </term> , while the <term> LSA-based and the cooccurrence-based word vectors </term> better reflect <term> associative similarity </term> .
    term annotations: model,8-3-P06-2110,ak ; other,13-3-P06-2110,ak ; model,18-3-P06-2110,ak ; other,26-3-P06-2110,ak
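The annotated abstract above compares three ways of constructing word vectors (LSA-based, cooccurrence-based, dictionary-based) by how well their vector-space similarities match taxonomic versus associative similarity. As a minimal sketch of one of those methods, the snippet below builds simple cooccurrence-based word vectors and compares them with cosine similarity; the window size, toy corpus, and the choice of raw counts plus cosine are illustrative assumptions, not the paper's actual experimental settings.

    # Sketch: cooccurrence-based word vectors compared by cosine similarity
    # in the vector space model. Window size, corpus and weighting are
    # illustrative assumptions, not the settings used in P06-2110.
    from collections import defaultdict
    import math

    def cooccurrence_vectors(sentences, window=2):
        """Build sparse cooccurrence count vectors: word -> {context word: count}."""
        vectors = defaultdict(lambda: defaultdict(int))
        for tokens in sentences:
            for i, w in enumerate(tokens):
                lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
                for j in range(lo, hi):
                    if j != i:
                        vectors[w][tokens[j]] += 1
        return vectors

    def cosine(u, v):
        """Cosine similarity between two sparse vectors stored as dicts."""
        dot = sum(u[k] * v.get(k, 0) for k in u)
        norm_u = math.sqrt(sum(x * x for x in u.values()))
        norm_v = math.sqrt(sum(x * x for x in v.values()))
        return dot / (norm_u * norm_v) if norm_u and norm_v else 0.0

    # Toy usage: "dog" and "cat" share contexts, so their cooccurrence
    # vectors come out more similar than "dog" and "bank".
    corpus = [
        "the dog chased the cat".split(),
        "the cat chased the dog".split(),
        "the bank raised the rate".split(),
    ]
    vecs = cooccurrence_vectors(corpus)
    print(cosine(vecs["dog"], vecs["cat"]))   # relatively high
    print(cosine(vecs["dog"], vecs["bank"]))  # relatively low

A dictionary-based method would instead derive the context counts from definition texts, and an LSA-based one would factorize the cooccurrence matrix (e.g. by SVD) before comparing vectors; the abstract's finding is that these choices shift which kind of similarity (taxonomic vs. associative) the resulting space captures best.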