tech,11-2-H01-1041,bq The <term> CCLINC Korean-to-English translation system </term> consists of two <term> core modules </term> , <term> language understanding and generation modules </term> mediated by a <term> language neutral meaning representation </term> called a <term> semantic frame </term> .
other,22-1-H01-1042,bq The purpose of this research is to test the efficacy of applying <term> automated evaluation techniques </term> , originally devised for the <term> evaluation </term> of <term> human language learners </term> , to the <term> output </term> of <term> machine translation ( MT ) systems </term> .
tech,3-2-H01-1049,bq We integrate a <term> spoken language understanding system </term> with <term> intelligent mobile agents </term> that mediate between <term> users </term> and <term> information sources </term> .
other,13-3-H01-1055,bq The issue of <term> system response </term> to <term> users </term> has been extensively studied by the <term> natural language generation community </term> , though rarely in the context of <term> dialog systems </term> .
model,11-1-H01-1058,bq In this paper , we address the problem of combining several <term> language models ( LMs ) </term> .
tech,17-1-H01-1070,bq This paper proposes a practical approach employing <term> n-gram models </term> and <term> error-correction rules </term> for <term> Thai key prediction </term> and <term> Thai-English language identification </term> .
other,10-5-P01-1007,bq The <term> non-deterministic parsing choices </term> of the <term> main parser </term> for a <term> language L </term> are directed by a <term> guide </term> which uses the <term> shared derivation forest </term> output by a prior <term> RCL parser </term> for a suitable <term> superset of L </term> .
tech,6-1-P01-1008,bq While <term> paraphrasing </term> is critical both for <term> interpretation and generation of natural language </term> , current systems use manual or semi-automatic methods to collect <term> paraphrases </term> .
tech,14-2-P01-1009,bq These <term> words </term> appear frequently enough in <term> dialog </term> to warrant serious <term> attention </term> , yet present <term> natural language search engines </term> perform poorly on <term> queries </term> containing them .
tech,7-1-P01-1056,bq <term> Techniques for automatically training </term> modules of a <term> natural language generator </term> have recently been proposed , but a fundamental concern is whether the <term> quality </term> of <term> utterances </term> produced with <term> trainable components </term> can compete with <term> hand-crafted template-based or rule-based approaches </term> .
other,11-4-N03-1001,bq The <term> classification accuracy </term> of the <term> method </term> is evaluated on three different <term> spoken language system domains </term> .
tech,14-1-N03-1004,bq Motivated by the success of <term> ensemble methods </term> in <term> machine learning </term> and other areas of <term> natural language processing </term> , we developed a <term> multi-strategy and multi-source approach to question answering </term> which is based on combining the results from different <term> answering agents </term> searching for <term> answers </term> in multiple <term> corpora </term> .
other,9-3-N03-1017,bq Our empirical results , which hold for all examined <term> language pairs </term> , suggest that the highest levels of performance can be obtained through relatively simple means : <term> heuristic learning </term> of <term> phrase translations </term> from <term> word-based alignments </term> and <term> lexical weighting </term> of <term> phrase translations </term> .
tech,6-1-N03-2003,bq Sources of <term> training data </term> suitable for <term> language modeling </term> of <term> conversational speech </term> are limited .
model,28-1-N03-2006,bq In order to boost the <term> translation quality </term> of <term> EBMT </term> based on a small-sized <term> bilingual corpus </term> , we use an out-of-domain <term> bilingual corpus </term> and , in addition , the <term> language model </term> of an in-domain <term> monolingual corpus </term> .
model,11-3-N03-2036,bq During <term> decoding </term> , we use a <term> block unigram model </term> and a <term> word-based trigram language model </term> .
tech,11-1-N03-3010,bq In this paper , we propose a novel <term> Cooperative Model </term> for <term> natural language understanding </term> in a <term> dialogue system </term> .
tech,27-2-N03-4004,bq It gives users the ability to spend their time finding more data relevant to their task , and gives them translingual reach into other <term> languages </term> by leveraging <term> human language technology </term> .
tech,13-1-N03-4010,bq The <term> JAVELIN system </term> integrates a flexible , <term> planning-based architecture </term> with a variety of <term> language processing modules </term> to provide an <term> open-domain question answering capability </term> on <term> free text </term> .
other,13-1-P03-1005,bq This paper proposes the <term> Hierarchical Directed Acyclic Graph ( HDAG ) Kernel </term> for <term> structured natural language data </term> .