</term> is ubiquitous and carries important information yet it is also time consuming to document
question is , however , how an interesting information piece would be found in a <term> large database
tech,1-4-H01-1001: large database </term> . Traditional <term> information retrieval techniques </term> use a <term> histogram
tech,10-1-H01-1040: show how two standard outputs from <term> information extraction ( IE ) systems </term> - <term>
text browser </term> . We describe how this information is used in a <term> prototype system </term>
other,14-2-H01-1040: prototype system </term> designed to support <term> information workers </term> ' access to a <term> pharmaceutical
evaluation techniques </term> will provide information about both the <term> human language learning
other,16-2-H01-1049: mediate between <term> users </term> and <term> information sources </term> . We have built and will
their logistics system to place a supply or information request . The request is passed to a <term>
measure(ment),7-2-H01-1070: rule-reduction algorithm </term> applying <term> mutual information </term> to reduce the <term> error-correction
write a <term> topical report </term> , culling information from a large inflow of <term> multilingual
tech,16-2-P03-1009: SCF ) </term> distributions using the <term> Information Bottleneck </term> and <term> nearest neighbour
tech,13-2-P03-1030: <term> new event detection </term> as <term> information retrieval task </term> and hypothesize on
other,11-5-P03-1031: ambiguity </term> based on <term> statistical information </term> obtained from <term> dialogue corpora
tool,10-6-P03-1033: </term> are implemented in <term> Kyoto city bus information system </term> that has been developed at
tech,3-7-P03-1050: Task-based evaluation </term> using <term> Arabic information retrieval </term> indicates an improvement
tech,19-1-P03-1068: large-scale <term> acquisition of word-semantic information </term> , e.g. the construction of <term> domain-independent
other,26-4-C04-1068: coarse-level <term> clustering </term> and simple <term> information retrieval </term> . Our evaluation shows
other,27-1-C04-1112: maximum entropy ) </term> with <term> linguistic information </term> . Instead of building individual <term>
other,12-3-N04-1022: incorporate different levels of <term> linguistic information </term> from <term> word strings </term> , <term>