C02-1025 system in Table 3 refers to the maximum entropy system that uses only local features
W02-1116 for both the baseline and the maximum entropy system . It can be seen that the MaxEnt
W13-1733 first describe in Section 2 our maximum entropy system used for the classification of
J01-2002 system ( TnT ) is better than the maximum entropy system ( MXPOST ) . On the other two
W04-0862 results : the output of their maximum entropy system and their boosting system . These
W11-3214 system is a combination of our maximum entropy system and Moses . Our training and
S07-1005 University of Melbourne used a maximum entropy system using a wide variety of syntactic
W02-0813 NLP ( Ratnaparkhi , 1998 ) . Our maximum entropy system performed competitively with
P01-1041 positive examples for the rule . 2.4 Maximum entropy system In order to compare our method
W11-3201 results show that the combination of maximum entropy system with Sequitur g2p tool and Moses
P15-4009 Jurafsky 2002 ) . Then the baseline maximum entropy system is developed using these features
C02-1143 as beneficial for Chinese . The maximum entropy system performed competitively with
W13-1733 conclusion We have submitted a maximum entropy system to the shared task on Native
W11-3214 show that the combination of our maximum entropy system with Sequitur g2p tool and Moses
P01-1041 performance is comparable to that of the maximum entropy system , and it can be trained more
W04-0845 Like the boosting model , the maximum entropy system also participated in the Senseval-3
W04-2807 with their frameset is the same maximum entropy system as that of Dang and Palmer (
C02-1143 ( 4 ) 2.1 English Results The maximum entropy system 's performance on the verbs from
W98-1118 three essential components of a maximum entropy system : histories , futures , and features
W98-1118 problem of N.E. recognition with the maximum entropy system acting as an ar - bitrator .