C02-1040 incorporated into the DFA in the feature selection step . 4.2 Obtaining the frequency
C00-1064 useful features are extended by feature selection . The retrieved features and
A97-1056 of the evaluation criteria in feature selection . During both BSS and FSS it
A94-1027 terms by using stop words list or feature selection method , etc. . The number of
C02-1054 We also presented an SVM-based feature selection method that removed 75 % of features
C00-1064 using maximum entropy modeling and feature selection concept . We devised a model
C00-1064 considered an initial supervision and feature selection . Our model starts with initial
C02-1054 method ` XQK - FS ' ( XQK with Feature Selection ) . This approximation slightly
A00-2026 maximum length M ' = 10 . 3.3.2 Feature Selection The feature patterns for NLG3
A00-1021 GuruQA ) are shown in Table 2 . 5.1 Feature selection The seven span features described
C02-1025 can be used for the same token . Feature selection is implemented using a feature
A97-1056 model selection also performs feature selection . If a model is selected where
C02-1054 . We also present an SVM-based feature selection method and an efficient training
C02-1054 works . We can use this fact for feature selection after the training . We simplify
A00-2018 maximum-entropy model this is done by feature selection , as in Ratnaparkhi 's maximum-entropy
C00-1064 mappings are obtained by automatic feature selection based on the maximum entropy
C02-1020 features for learning a model , but feature selection for ME modeling is more difficult
A00-2026 as the values of the search and feature selection parameters for all systems ,
C00-1064 scaled up . <title> Structural Feature Selection For English-Korean Statistical
A97-1056 to lead to high accuracy . If feature selection is not in doubt ( i.e. , it is