E06-3005 at creating our own paragraph selection technique. More interesting than paragraph
D09-1094 overfitting, we used a common feature selection technique of eliminating words appearing
D10-1061 selection outperformed all other selection techniques across all iterations of the
D14-1058 verbs using ℓ1-regularized feature selection technique. We select a small set of seed
D10-1061 in target hypotheses) for each selection technique. Here, discriminative sample
D08-1029 resolvable with string match. The selection techniques were unknown to ACE participants
D10-1061 numerous competing active and passive selection techniques. Our approach hinges on a maximum-entropy
E09-1019 approach The other proposed query selection technique (i) allows for an automatic
C02-1047 The MML principle is a model selection technique which applies information-theoretic
D12-1031 use this heuristic as a feature selection technique, discarding such features.
D10-1061 coupled with a greedy, incremental selection technique that discourages redundancy in
D09-1119 believe a better motivated feature selection technique taking into account linguistic
D14-1058 find that in practice using this selection technique leads to better performance than
D14-1104 tried Jiang and Zhai's subset selection technique (§3.1 in Jiang and Zhai
D11-1065 We find that the best selection technique is the recently proposed cross-entropy
D10-1061 proposed discriminative sample selection technique performs significantly better
E09-1082 an upper bound for the output selection technique. 2.2 Output Combination Consensus
D09-1012 comprehensive overview of feature selection techniques is carried out in (Guyon and
D13-1125 from the application of feature selection techniques especially when there are highly
D13-1011 investigate alternative training data selection techniques, disfluency handling strategies