Term annotations for paper P95-1034 (annotator: ak). Each record gives the term category followed by the term's word offset and sentence number within the abstract.

tech, 0-1: Large-scale natural language generation
other, 11-1: knowledge
tech, 1-2: robust generator
other, 13-2: knowledge
other, 6-3: incomplete or inaccurate inputs
tech, 9-4: hybrid generator
other, 16-4: symbolic knowledge
tech, 21-4: statistical methods
tech, 2-5: algorithms
model, 5-6: hybrid generation model
tech, 14-6: generators
other, 18-6: portability
other, 23-6: knowledge

Abstract of P95-1034, reconstructed from the overlapping context windows:

Large-scale natural language generation requires the integration of vast amounts of knowledge: lexical, grammatical, and conceptual. A robust generator must be able to operate well even when pieces of knowledge are missing. It must also be robust against incomplete or inaccurate inputs. To attack these problems, we have built a hybrid generator, in which gaps in symbolic knowledge are filled by statistical methods. We describe algorithms and show experimental results. We also discuss how the hybrid generation model can be used to simplify current generators and enhance their portability, even when perfect knowledge is in principle obtainable.