other,10-3-J05-1003,attempts to improve upon this initial <term>ranking</term>, using additional <term>features</term>
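These snippets come from the abstract of J05-1003 (Collins and Koo's article on discriminative reranking for natural language parsing): a base parser proposes candidate parses whose probabilities define an initial ranking, and a second model reranks them using additional features of each tree. A minimal sketch of that two-stage setup, assuming illustrative names (`Candidate`, `rerank`, `feature_weights`) that are not the article's code:

```python
from dataclasses import dataclass, field

@dataclass
class Candidate:
    log_prob: float                               # log-probability from the base parser
    features: set = field(default_factory=set)    # indicator features of the tree

def rerank(candidates, feature_weights, w0=1.0):
    """Order candidates by base log-probability plus weighted feature evidence."""
    def score(c):
        return w0 * c.log_prob + sum(feature_weights.get(f, 0.0) for f in c.features)
    return sorted(candidates, key=score, reverse=True)

# The initial ranking uses log_prob alone; feature weights can overturn it.
cands = [Candidate(-10.2, {"rule:NP->DT NN"}),
         Candidate(-10.5, {"rule:S->NP VP"})]
best = rerank(cands, {"rule:S->NP VP": 0.6})[0]   # reranker promotes the second parse
```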
tech,3-6-J05-1003,al. (1998)</term>. We apply the <term>boosting method</term> to <term>parsing</term> the <term>Wall
tech,13-5-J05-1003,reranking task</term>, based on the <term>boosting approach</term> to <term>ranking problems</term> described
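The two snippets above refer to the boosting approach to ranking problems of Freund et al. (1998), which the article adapts to parse reranking. Below is a hedged sketch of one simplified boosting round: an exponential loss over the margins between each sentence's best parse and a competitor, with the standard closed-form update delta = 0.5*ln(W+/W-). The smoothing constant and data layout are my assumptions, not the article's exact algorithm:

```python
import math
from collections import defaultdict

def boosting_round(pairs, margins, eps=1e-4):
    """Pick one feature and its weight update so as to reduce the exponential
    ranking loss sum_i exp(-margins[i]).  pairs[i] = (gold_feats, comp_feats)
    are feature sets; margins[i] = current score gap F(gold_i) - F(comp_i)."""
    w_plus, w_minus = defaultdict(float), defaultdict(float)
    for (gold, comp), m in zip(pairs, margins):
        w = math.exp(-m)                 # weight of this pair under the current model
        for f in gold - comp:            # feature is evidence for the gold parse
            w_plus[f] += w
        for f in comp - gold:            # feature is evidence for the competitor
            w_minus[f] += w
    def gain(f):                         # standard boosting gain criterion
        return abs(math.sqrt(w_plus[f]) - math.sqrt(w_minus[f]))
    best = max(set(w_plus) | set(w_minus), key=gain)
    delta = 0.5 * math.log((w_plus[best] + eps) / (w_minus[best] + eps))
    return best, delta
```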
measure(ment),6-8-J05-1003,<term>model</term> achieved 89.75% <term>F-measure</term>, a 13% relative decrease in <term>
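The snippet is truncated mid-term; in the source abstract the sentence continues "F-measure error over the baseline model's score of 88.2%". Taking those two published figures, the 13% claim is a relative reduction in error, which checks out:

```latex
\[
\text{baseline error} = 100 - 88.2 = 11.8, \qquad
\text{boosted error} = 100 - 89.75 = 10.25,
\]
\[
\frac{11.8 - 10.25}{11.8} \approx 0.131 \approx 13\%\ \text{relative decrease in error}.
\]
```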
tech,2-2-J05-1003,probabilistic parser</term>. The base <term>parser</term> produces a set of <term>candidate
other,17-3-J05-1003,additional <term>features</term> of the <term>tree</term> as evidence. The strength of our
other,25-7-J05-1003,additional 500,000 <term>features</term> over <term>parse trees</term> that were not included in the original
tech,11-1-J05-1003,which rerank the output of an existing <term>probabilistic parser</term>. The base <term>parser</term> produces
tech,8-12-J05-1003,experiments in this article are on <term>natural language parsing (NLP)</term>, the <term>approach</term> should
tech,8-10-J05-1003,significant efficiency gains for the new <term>algorithm</term> over the obvious <term>implementation
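The article attributes these efficiency gains to a new algorithm that exploits the sparsity of the feature space in the parsing data: after a weight update, only the margins of pairs actually containing the updated feature change. A sketch of one way to realize that, via an inverted index (the function names and layout are assumptions, compatible with the `boosting_round` sketch above):

```python
from collections import defaultdict

def build_index(pairs):
    """Inverted index: feature -> list of (pair_id, sign), where sign is +1 if
    the feature appears only in the gold parse and -1 if only in the competitor."""
    index = defaultdict(list)
    for i, (gold, comp) in enumerate(pairs):
        for f in gold - comp:
            index[f].append((i, +1))
        for f in comp - gold:
            index[f].append((i, -1))
    return index

def apply_update(index, margins, feature, delta):
    """Adjust only the margins that contain `feature`, instead of rescoring
    every pair as the obvious implementation would."""
    for i, sign in index.get(feature, ()):
        margins[i] += sign * delta
```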
other,23-12-J05-1003,should be applicable to many other <term>NLP problems</term> which are naturally framed as <term>
model,7-7-J05-1003,<term>log-likelihood</term> under a <term>baseline model</term> (that of <term>Collins [1999]</term>
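This snippet, together with snippet 25-7 above, describes the reranker's scoring function: the log-likelihood under the baseline model (Collins [1999]) combined with evidence from roughly 500,000 indicator features over parse trees. In the usual boosting/reranking notation (the symbols here are generic, not copied from the article), the score of a candidate parse x is:

```latex
\[
F(x) \;=\; \lambda_0\, L(x) \;+\; \sum_{s=1}^{m} \lambda_s\, h_s(x),
\]
```

where L(x) is the log-likelihood under the baseline model and each h_s(x) in {0,1} indicates whether feature s occurs in the tree.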
tech,43-12-J05-1003,<term>machine translation</term>, or <term>natural language generation</term>. We present a novel <term>method</term>