tech,17-1-H92-1095,bq spoken language understanding </term> , <term> text understanding </term> , and <term> document
tech,3-1-C04-1116,bq smaller and more robust . We present a <term> text mining method </term> for finding <term> synonymous
tech,26-3-P04-2005,bq Sense Disambiguation ( WSD ) </term> and <term> Text Summarisation </term> . Our method takes
other,2-1-C94-1026,bq homophone errors </term> . To align <term> bilingual texts </term> becomes a crucial issue recently
tech,6-1-P84-1078,bq describes <term> Paul </term> , a <term> computer text generation system </term> designed to create
other,24-1-N03-4010,bq answering capability </term> on <term> free text </term> . The demonstration will focus on
tech,15-2-N06-4001,bq researchers who are not experts in <term> text mining </term> . As evidence of its usefulness
other,10-2-A88-1001,bq heuristically-produced complete <term> sentences </term> in <term> text </term> or <term> text-to-speech form </term>
other,13-1-P82-1035,bq under the assumption that the input <term> text </term> will be in reasonably neat form ,
other,0-1-A94-1026,bq language translation </term> . <term> Japanese texts </term> frequently suffer from the <term> homophone
lr-prod,15-3-H94-1014,bq <term> word </term><term> Wall Street Journal text corpus </term> . Using the <term> BU recognition
other,12-3-C92-4207,bq </term> , which takes <term> natural language texts </term> and produces a <term> model </term> of
other,35-1-I05-4010,bq numbering system </term> in the <term> legal text hierarchy </term> . Basic methodology and
tech,8-1-C90-3072,bq have become an integral part of most <term> text processing software </term> . From different
other,11-7-H01-1042,bq six extracts of <term> translated newswire text </term> . Some of the extracts were <term>
other,6-2-C88-1044,bq </term> . We examine a broad range of <term> texts </term> to show how the distribution of <term>
papers in English , many systems to run off texts have been developed . In this paper , we
lr,1-3-P03-1050,bq training resources </term> . No <term> parallel text </term> is needed after the <term> training
other,12-4-P06-1013,bq are derived automatically from <term> raw text </term> . Experiments using the <term> SemCor
other,20-2-P01-1008,bq translations </term> of the same <term> source text </term> . Our approach yields <term> phrasal