D12-1053 then be used in the next step for feature generation . This is done by means of a
D12-1002 character space . An example for feature generation is shown in Sec . 3 . After converting
D11-1141 these classifiers are used in feature generation for named entity recognition
D10-1095 for CRF implementation . For feature generation we used a combination of standard
D12-1045 this distinction later during feature generation ( Section 5 ) . To compare event
A00-2017 and verb . As a final comment on feature generation , we note that the language presented
D10-1110 leverage temporal information . 3 Feature Generation To better understand our work
D08-1001 Huber et al. ( 2006 ) ) . 4.3 Feature Generation The annotation discussed above
D12-1011 ri counts how many Algorithm : Feature Generation Input : DB , a database in BCNF
D10-1095 summarized in Table 1 . We followed the feature generation process of ( Sha and Pereira
A00-2017 between the two words . In our feature generation language we separate the information
D14-1082 table , and thus greatly reduces feature generation time . Instead , it involves
D08-1013 analysis task , the process of feature generation will be presented . 4.1 Adding
D09-1053 , LambdaSMART can be used as a feature generation method . LambdaSMART is arguably
D13-1042 a secondary cause for ( word ) feature generation , supplementing and smoothing
D14-1066 features , we use UMLS solely for feature generation . 4.3 Google Web1T We use the
D08-1098 the increased complexity of the feature generation . Finally , combining question
D13-1041 generated by local learner for global feature generation , while we search the top k candidates
C00-2156 grapheme-to-phoneme conversion and prosodic feature generation . Moreover , gral ) helnes
D12-1011 the database with key id , the feature generation algorithm generates two types