… In this presentation, we describe the features of and requirements for a genuinely useful …
… called a <term> semantic frame </term>. The key features of the <term> system </term> include: (i …
other,7-2-P01-1070,ak: … which are built from <term> shallow linguistic features </term> of questions, are employed to predict …
other,36-1-N03-1033,ak: … </term>, (ii) broad use of <term> lexical features </term>, including jointly conditioning …
other,66-1-N03-1033,ak: … fine-grained modeling of <term> unknown word features </term>. Using these ideas together, the …
other,12-3-P03-1002,ak: … based on: (1) an extended set of <term> features </term>; and (2) <term> inductive decision …
other,5-3-P03-1022,ak: … non-NP-antecedents </term>. We present a set of <term> features </term> designed for <term> pronoun resolution …
other,18-3-P03-1022,ak: … </term> and determine the most promising <term> features </term>. We evaluate the system on twenty …
… techniques </term> are able to produce useful features for <term> paraphrase classification </term> …
other,14-3-J05-1003,ak: … <term> ranking </term>, using additional <term> features </term> of the <term> tree </term> as evidence …
other,19-4-J05-1003,ak: … represented as an arbitrary set of <term> features </term>, without concerns about how these …
other,26-4-J05-1003,ak: … without concerns about how these <term> features </term> interact or overlap and without the …
other,45-4-J05-1003,ak: … generative model </term> which takes these <term> features </term> into account. We introduce a new …
other,23-7-J05-1003,ak: … evidence from an additional 500,000 <term> features </term> over <term> parse trees </term> that …
other,19-9-J05-1003,ak: … of the <term> sparsity </term> of the <term> feature space </term> in the <term> parsing data </term> …
tech,21-11-J05-1003,ak: … simplicity and efficiency — to work on <term> feature selection methods </term> within <term> log-linear …
tech,40-5-P05-1010,ak: … </term> created using extensive <term> manual feature selection </term>. This paper considers …
other,11-6-P05-1053,ak: … effective incorporation of diverse <term> features </term> enables our system outperform previously …
tech,6-2-P05-1057,ak: … knowledge sources </term> are treated as <term> feature functions </term>, which depend on the <term> …
other,20-4-P05-1057,ak: … bilingual dictionary coverage </term> as <term> features </term>. Our experiments show that <term> …