S13-2017 | model, in combination with latent | vector weighting | . The system computes the similarity |
S13-2017 | with a technique called latent | vector weighting | . The system computes the similarity |
J09-3004 | improving the quality of feature | vector weighting | in distributional word similarity |
S13-2017 | that our model based on latent | vector weighting | performs quite a bit better than |
S13-2017 | Methodology Our method uses latent | vector weighting | (Van de Cruys et al., 2011 |
S13-2017 | and our model that uses latent | vector weighting | . The results indicate that our |
W14-1502 | (i.e. before, or after, | vector weighting | ). Table 7 shows the average |
D11-1094 | density as well. <title> Latent | Vector Weighting | for Word Meaning in Context </title> |
P04-1080 | tested CGDterm using various word | vector weighting | methods when deriving context |
P14-1023 | that motivates traditional count | vector weighting | measures such as PMI). This |
S13-2007 | pus, in combination with Latent | Vector Weighting | (Van de Cruys et al., 2011 |
S13-2017 | and Compositional using Latent | Vector Weighting | </title> Tim Van de Cruys Stergos |
E14-1025 | word, wi and context word, cj. | Vector Weighting | We used the tTest and PPMI weighting |
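Several of the snippets above refer to association-based weighting of co-occurrence count vectors (P14-1023's "traditional count vector weighting measures such as PMI" and E14-1025's "tTest and PPMI weighting"). As a minimal illustrative sketch of one such measure, the Python function below applies positive PMI (PPMI) weighting to a word-by-context count matrix; the function name and the toy matrix are assumptions made for this example and are not taken from any of the cited papers.

```python
import numpy as np

def ppmi_weight(counts):
    """Apply positive pointwise mutual information (PPMI) weighting to a
    word-by-context co-occurrence count matrix (illustrative sketch).

    counts: 2-D numpy array of raw co-occurrence counts.
    Returns an array of the same shape holding PPMI weights.
    """
    total = counts.sum()
    row_p = counts.sum(axis=1, keepdims=True) / total   # P(word)
    col_p = counts.sum(axis=0, keepdims=True) / total   # P(context)
    joint = counts / total                               # P(word, context)
    with np.errstate(divide="ignore", invalid="ignore"):
        pmi = np.log2(joint / (row_p * col_p))
    pmi[~np.isfinite(pmi)] = 0.0   # zero counts get weight 0
    return np.maximum(pmi, 0.0)    # keep only positive associations

if __name__ == "__main__":
    # Toy 2x3 count matrix, purely for demonstration
    counts = np.array([[10.0, 0.0, 3.0],
                       [2.0, 8.0, 1.0]])
    print(ppmi_weight(counts))
```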