W13-3509 approach was based on a naive hard matching between word lemmas . Below we
W09-0204 In addition , the problem of hard matching can be alleviated by processing
S13-1022 method as the NLPM models . Here , hard matching was performed , where matching
P09-1106 Another reason is that TER uses hard matching in computing edit distance .
C04-1078 that is not matched by any of the hard matching pattern rules . Only those fields
C04-1078 bootstrapping framework in which soft and hard matching pattern rules are combined in
D15-1210 refer to its loss function as hard matching : AHM ( a1 , a2 ) = 1 −
P07-1026 Tree Path feature is sensitive to hard matching between any two sub-trees without
P07-1026 However , it only carries out hard matching , which may lead to over-fitting
C04-1078 variations . The reason is that hard matching techniques result in relatively
D15-1210 developing a soft version of the hard matching loss function because this will
P03-2040 alignment , if we do more than hard matching of punctuations and take into
D15-1210 BERKELEY BLEU . " HM " denotes the hard matching loss function , " SM " denotes
E95-1010 of byte-length ratio measures , hard matching of numbers , string comparisons
P07-1026 convolution tree kernel only carries out hard matching , so it fails to handle similar
C04-1078 induction techniques based on hard matching ( i.e. , strict slot-by-slot
P06-2074 Xiao et al. ( 2004 ) stated that hard matching techniques tend to have low recall
C04-1078 <title> Cascading Use of Soft and Hard Matching Pattern Rules for Weakly Supervised