D09-1049 have experimented with additional feature combinations , with no improvement in results
D09-1056 shows this correlation for several feature combinations . This is an indication that
D08-1049 adding character n-grams to other feature combinations always gives significant improvements
C86-1140 nucleus class . Of all possible feature combinations , the one with the highest consistency
D09-1160 classifiers and their ways to consider feature combinations . In classification-based NLP
D09-1107 extends easily to cover non-linear feature combinations . The reason for using amoeba
D09-1056 coreferent pairs lower . In the case of feature combinations we can study them by training
D08-1024 Marton and Resnik 's XP = and XP ' feature combinations , respectively . Fine-grained
C04-1088 classification accuracy below the other feature combinations . In combination , though , these
D09-1056 next section . 4.3 Analysis of feature combinations Up to now we have analysed the
D08-1087 development set are starred . 4.5 Feature Combination Unlike most previous work , n-gram
D09-1111 opportunity to improve through system or feature combination : an oracle that perfectly selects
C04-1166 Passages : Exploring Linguistic Feature Combinations via Machine Learning . In Proceedings
D08-1024 many larger , potentially better feature combinations remained unexplored . Moreover
D09-1056 features . First , we use each feature combination as the input for a Machine Learning
D08-1024 features and manually selected feature combinations each in a separate model . Although
C90-3030 there a reading with any of the feature combinations defining finite verbs . The condition
D08-1087 roughly half for a variety of feature combinations . However , all improvements
C04-1190 by exploiting the similarity to feature combinations that have been seen . However
D09-1032 ( w ) = N! / ( n1! n2! ... nr! ) 4.4 Feature combination using linear regression We also