D15-1010 similarity score vectors. We use matrix multiplication to calculate the similarities
E06-1013 training process. By avoiding matrix multiplication, data of high dimensionality
D11-1016 matrix-space framework with iterated matrix multiplication defines an elegant framework
D15-1167 word as a matrix and use iterated matrix multiplication as the phrase-level composition function
D14-1111 vector argument, corresponds to matrix multiplication. Figure 1 shows how the syntactic
D14-1111 multi-linear generalisation of matrix multiplication (Grefenstette, 2013). Consider
D11-1016 matrices, since for affine matrices matrix multiplication represents both operations:
D14-1111 for a transitive verb sentence matrix multiplication, to give a sentence vector.
D14-1082 chosen from Sw, we pre-compute matrix multiplications for the 10,000 most frequent
E14-1025 create the RI vectors through matrix multiplication rather than sequentially. We
D11-1016 word order into account, since matrix multiplication is not a commutative operation
D12-1110 matrix multiplication. Since matrix multiplication is associative, this can
E97-1002 Valiant showed that Boolean matrix multiplication (BMM) can be used for CFG parsing
D13-1144 algorithm to compute it as a simple matrix multiplication formulation.
D13-1140 , allowing the use of matrix-matrix multiplications instead of matrix-vector multiplications
D13-1166 more than the usual notion of matrix multiplication between a matrix and a vector
D13-1195 of the GPU and similar to dense matrix multiplication, which achieves the device's highest
D11-1016 and combine words using iterated matrix multiplication, which allows for the modeling
E14-1033 graph can be reduced to a Boolean matrix multiplication. For each temporal relation
D15-1026 D · C. We only count the matrix multiplication operations, as they take the
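
Several of the hits above (D11-1016, D15-1167, D12-1110) turn on two algebraic facts: matrix multiplication is not commutative, which is why iterated matrix products can encode word order, but it is associative, which is why the product can be bracketed freely. A minimal Python/NumPy sketch of that composition idea; the toy lexicon, random initialisation, and dimensionality are illustrative assumptions, not taken from any of the cited papers:

import numpy as np

rng = np.random.default_rng(0)
dim = 4  # toy dimensionality; real matrix-space models use larger matrices

# Hypothetical word matrices, randomly initialised for illustration only.
lexicon = {w: rng.standard_normal((dim, dim)) for w in ("not", "very", "good")}

def compose(words):
    """Compose a phrase as the left-to-right product of its word matrices."""
    out = np.eye(dim)
    for w in words:
        out = out @ lexicon[w]
    return out

# Word order matters: matrix multiplication is not commutative ...
a = compose(["not", "very", "good"])
b = compose(["very", "not", "good"])
print(np.allclose(a, b))  # False

# ... but it is associative, so any bracketing gives the same phrase matrix.
left = (lexicon["not"] @ lexicon["very"]) @ lexicon["good"]
right = lexicon["not"] @ (lexicon["very"] @ lexicon["good"])
print(np.allclose(left, right))  # True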
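
The E97-1002 and E14-1033 hits both rest on the same reduction: composing a binary relation (grammar derivation steps, temporal edges) is a Boolean matrix multiplication. A toy sketch of that reduction; the three-node relation graph is a made-up example, not data from either paper:

import numpy as np

# Toy adjacency matrix: R[i, j] == 1 iff relation i -> j holds
# (e.g., "event i before event j" in a temporal graph).
R = np.array([[0, 1, 0],
              [0, 0, 1],
              [0, 0, 0]])

# Boolean matrix product: entry (i, j) of R @ R is nonzero iff some k
# satisfies i -> k and k -> j, i.e. a two-step chain composes.
R2 = (R @ R > 0).astype(int)
print(R2)
# [[0 0 1]
#  [0 0 0]
#  [0 0 0]]   -> the composed relation 0 -> 2 is inferred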