This paper presents a novel method of generating and applying
<term>
hierarchical, dynamic topic-based language models
</term>
.
It proposes and evaluates new
<term>
cluster generation
</term>
,
<term>
hierarchical smoothing
</term>
and
<term>
adaptive topic-probability estimation techniques
</term>
.
These combined models help capture
<term>
long-distance lexical dependencies
</term>
.
Experiments on the
<term>
Broadcast News corpus
</term>
show significant improvement in
<term>
perplexity
</term>
( 10.5 % overall and 33.5 % on
<term>
target vocabulary
</term>
) .