C94-1049 their vectors (or actually the inner product of their normalized vectors)
C04-1059 calculated proportional to the inner product of the two vectors. All sentences
C04-1064 space. WSK's value is just the inner product of the two vectors. For instance
D08-1112 N-best labelings. Second, the inner product in the ratio calculation (4
D11-1102 that SVM training depends on the inner product between instances. Kernel methods
D09-1163 distance is normally the cosine or inner product measure between two vectors.
C04-1070 kernel function that expresses the inner product between two examples in the desired
D11-1023 stream such as point, range and inner product queries to be approximately answered
D09-1084 queries is then defined as the inner product between the corresponding centroid
D09-1142 implicitly, that is by specifying the inner product between each pair of points rather
D11-1058 operator (·) denotes the inner product with respect to the subscripts
C02-1053 Vector where "*" represents the inner product. In general, such a hyperplane
D08-1114 L2 norm, and ⟨xi, xj⟩ is the inner product of xi and xj. The first similarity
A97-1043 of Pi and Pj is measured by the inner product of their normalised vectors and
D09-1142 where ⟨·,·⟩ denotes the inner product in F. In the case of binary
C04-1070 (x⃗i, z⃗) is the inner product between the example vectors.
D11-1058 dot operators here stand for the inner product with respect to the index sets
C02-1086 positively affect the vector inner product value. Therefore, since this
C02-1072 term-to-term similarity is based on the inner products between two row vectors of A
C02-1086 document is calculated by the vector inner product of the query and document vectors
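
Several of the hits above (C94-1049, D09-1163, A97-1043, C02-1086) describe similarity as the inner product of two (normalised) vectors, i.e. cosine similarity. The following is a minimal sketch of that computation; the vectors and names are illustrative and not taken from any of the cited papers.

import math

def inner_product(u, v):
    # plain dot product of two equal-length vectors
    return sum(a * b for a, b in zip(u, v))

def cosine_similarity(u, v):
    # inner product of the L2-normalised vectors
    norm_u = math.sqrt(inner_product(u, u))
    norm_v = math.sqrt(inner_product(v, v))
    if norm_u == 0 or norm_v == 0:
        return 0.0
    return inner_product(u, v) / (norm_u * norm_v)

# e.g. term-frequency vectors for a query and a document (illustrative data)
query_vec = [1, 0, 2, 1]
doc_vec   = [2, 1, 1, 0]
print(cosine_similarity(query_vec, doc_vec))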
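
Other hits (D11-1102, C04-1070, D09-1142) refer to the kernel view: a kernel function reports the inner product of two examples in a feature space, so SVM training only needs the pairwise inner products (the Gram matrix), never the feature vectors themselves. A minimal sketch, assuming a plain linear kernel and illustrative data:

def linear_kernel(x, z):
    # linear kernel = ordinary inner product in input space
    return sum(a * b for a, b in zip(x, z))

def gram_matrix(examples, kernel):
    # pairwise inner products K[i][j] = kernel(x_i, x_j)
    return [[kernel(xi, xj) for xj in examples] for xi in examples]

X = [[1.0, 0.0], [0.5, 0.5], [0.0, 1.0]]
K = gram_matrix(X, linear_kernel)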