W06-1632 development data and TiMBL . The feature subset selection algorithm performs a hill-climbing
N06-1025 using the development data . The feature subset selection algorithm performs a hill-climbing
W06-3401 method . Feature normalization and feature subset selection using recursive feature elimination
N04-4002 recently addressed the issue of feature subset selection [ 2 ] [ 4 ] [
D08-1081 liblinear toolkit 1 implements simple feature subset selection based on the F statistic ( Chen
J12-4003 we performed floating forward feature subset selection using only the information contained
H05-1027 millions of candidate features . 3.4 Feature Subset Selection This section describes our method
W02-2030 features for semantic trees makes the feature subset selection method less applicable since
P04-1080 , 1 < i < N. ( 1 ) The feature subset selection in word set W can be formulated
W13-3616 apply a genetic algorithm to do feature subset selection in order to reduce dimensionality
W02-1410 entire words themselves ) , thus no feature subset selection was performed . ( Wiebe , 2000
C02-1130 classifications in different contexts . with feature subset selection ( Kohavi & Sommerfield ,
W13-2242 these learning models after a feature subset selection with recursive feature elimination
W14-3339 these learning models after a feature subset selection with recursive feature elimination
S14-2085 these learning models after a feature subset selection with recursive feature elimination
I05-2045 rankings of the features . 2.2 Feature Subset Selection and Model Order Identification
W15-4924 the features . Correlation-based Feature Subset Selection for Machine Learning by M. A.
D15-1303 on the cyclic correlation-based feature subset selection ( CFS ) and another based on
J01-1002 Tree for the Prosody-Only Model Feature subset selection was run with an initial set of
D09-1140 toolkit2 ( Fan et al. , 2008 ) . Feature subset selection is carried out by calculating