improvement </term> over the original tests . <term> Large-scale natural language generation </term> requires the integration of vast amounts of knowledge : lexical , grammatical , and conceptual . A <term> robust generator </term> must be able to operate well even when pieces of knowledge are missing . It must also be robust against <term> incomplete or inaccurate inputs </term> . To attack these problems , we have built a <term> hybrid generator </term> , in which gaps in <term> symbolic knowledge </term> are filled by <term> statistical methods </term> . We describe <term> algorithms </term> and show experimental results . We also discuss how the <term> hybrid generation model </term> can be used to simplify current <term> generators </term> and enhance their <term> portability </term> .
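The abstract only names the hybrid architecture, so the following is a minimal sketch of one way such statistical gap-filling could look: a symbolic generator leaves an undecided lexical or grammatical choice as alternatives in a small word lattice, and a smoothed bigram model ranks the paths. The lattice, the bigram counts, and the bigram_logprob helper are invented here for illustration; they are assumptions, not components described in the paper.

```python
from itertools import product
from collections import defaultdict
import math

# Toy bigram counts standing in for a corpus-trained language model (assumption).
BIGRAM_COUNTS = {
    ("she", "visited"): 8, ("she", "visits"): 2,
    ("visited", "the"): 9, ("visits", "the"): 1,
    ("the", "museum"): 7, ("the", "museums"): 3,
}
UNIGRAM_COUNTS = defaultdict(int)
for (w1, _), c in BIGRAM_COUNTS.items():
    UNIGRAM_COUNTS[w1] += c

def bigram_logprob(tokens, alpha=0.5, vocab_size=50):
    """Add-alpha smoothed bigram log-probability of a token sequence."""
    score = 0.0
    for w1, w2 in zip(tokens, tokens[1:]):
        num = BIGRAM_COUNTS.get((w1, w2), 0) + alpha
        den = UNIGRAM_COUNTS[w1] + alpha * vocab_size
        score += math.log(num / den)
    return score

# Output of a symbolic generator: a word lattice whose positions list the
# choices that symbolic knowledge alone could not decide between (the gaps).
lattice = [["she"], ["visited", "visits"], ["the"], ["museum", "museums"]]

# Statistical gap-filling: enumerate the paths and keep the highest-scoring one.
candidates = [list(path) for path in product(*lattice)]
best = max(candidates, key=bigram_logprob)
print(" ".join(best))  # e.g. "she visited the museum"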