other,7-2-P01-1070,bq These <term> models </term> , which are built from <term> shallow linguistic features </term> of <term> questions </term> , are employed to predict target variables which represent a <term> user 's informational goals </term> .
tech,4-2-N03-1026,bq Our <term> system </term> incorporates a <term> linguistic parser/generator </term> for <term> LFG </term> , a <term> transfer component </term> for <term> parse reduction </term> operating on <term> packed parse forests </term> , and a <term> maximum-entropy model </term> for <term> stochastic output selection </term> .
other,27-1-C04-1112,bq In this paper , we present a <term> corpus-based supervised word sense disambiguation ( WSD ) system </term> for <term> Dutch </term> which combines <term> statistical classification ( maximum entropy ) </term> with <term> linguistic information </term> .
other,12-3-N04-1022,bq We describe a hierarchy of <term> loss functions </term> that incorporate different levels of <term> linguistic information </term> from <term> word strings </term> , <term> word-to-word alignments </term> from an <term> MT system </term> , and <term> syntactic structure </term> from <term> parse-trees </term> of <term> source and target language sentences </term> .
We describe an efficient <term> decoder </term> and show that using these <term> tree-based models </term> in combination with conventional <term> SMT models </term> provides a promising approach that incorporates the power of <term> phrasal SMT </term> with the linguistic generality available in a <term> parser </term> .
other,12-3-P06-2059,bq The idea behind our method is to utilize certain <term> layout structures </term> and <term> linguistic pattern </term> .
other,6-2-C86-1132,bq <term> RAREAS </term> draws on several kinds of <term> linguistic and non-linguistic knowledge </term> and mirrors a forecaster 's apparent tendency to ascribe less precise <term> temporal adverbs </term> to more remote meteorological events .
other,25-2-J86-3001,bq In this theory , <term> discourse structure </term> is composed of three separate but interrelated components : the structure of the sequence of <term> utterances </term> ( called the <term> linguistic structure </term> ) , a structure of <term> purposes </term> ( called the <term> intentional structure </term> ) , and the state of <term> focus of attention </term> ( called the <term> attentional state </term> ) .
other,1-3-J86-3001,bq The <term> linguistic structure </term> consists of segments of the <term> discourse </term> into which the <term> utterances </term> naturally aggregate .
other,13-4-J86-3001,bq The <term> intentional structure </term> captures the <term> discourse-relevant purposes </term> , expressed in each of the <term> linguistic segments </term> as well as relationships among them .
other,8-3-P86-1011,bq We then turn to a discussion comparing the <term> linguistic expressiveness </term> of the two <term> formalisms </term> .
other,11-1-P86-1038,bq <term> Unification-based grammar formalisms </term> use structures containing sets of <term> features </term> to describe <term> linguistic objects </term> .
other,21-2-C88-2160,bq The explanation of an <term> ambiguity </term> or an error for the purposes of correction does not use any concepts of the underlying <term> linguistic theory </term> : it is a reformulation of the erroneous or ambiguous <term> sentence </term> .
other,17-2-C88-2162,bq For one thing , <term> learning methodology </term> applicable in <term> general domains </term> does not readily lend itself in the <term> linguistic domain </term> .
other,3-3-C88-2162,bq For another , <term> linguistic representation </term> used by <term> language processing systems </term> is not geared to <term> learning </term> .
other,4-4-C88-2162,bq We introduced a new <term> linguistic representation </term> , the <term> Dynamic Hierarchical Phrasal Lexicon ( DHPL ) </term> [ Zernik88 ] , to facilitate <term> language acquisition </term> .
other,3-7-C88-2162,bq First , how <term> linguistic concepts </term> are acquired from <term> training examples </term> and organized in a <term> hierarchy </term> ; this task was discussed in previous papers [ Zernik87 ] .
other,16-8-C88-2162,bq Second , we show in this paper how a <term> lexical hierarchy </term> is used in predicting new <term> linguistic concepts </term> .
tech,12-4-C90-3063,bq The scheme was implemented by gathering <term> statistics </term> on the output of other <term> linguistic tools </term> .
other,27-2-J90-3002,bq The basic goal in building that <term> editor </term> was to provide an adequate tool to help <term> lexicologists </term> produce a valid and coherent <term> dictionary </term> on the basis of a <term> linguistic theory </term> .