tech,10-1-P99-1022,ak |
method of generating and applying
<term>
|
hierarchical , dynamic topic-based language models
|
</term>
. It proposes and evaluates new
<term>
|
#30624
This paper presents a novel method of generating and applying hierarchical, dynamic topic-based language models. |
|
tech,5-2-P99-1022,ak |
</term>
. It proposes and evaluates new
<term>
|
cluster generation
|
</term>
,
<term>
hierarchical smoothing
</term>
|
#30636
It proposes and evaluates new cluster generation, hierarchical smoothing and adaptive topic-probability estimation techniques. |
|
tech,8-2-P99-1022,ak |
new
<term>
cluster generation
</term>
,
<term>
|
hierarchical smoothing
|
</term>
and
<term>
adaptive topic-probability
|
#30639
It proposes and evaluates new cluster generation, hierarchical smoothing and adaptive topic-probability estimation techniques. |
|
other,5-3-P99-1022,ak |
These combined models help capture
<term>
|
long-distance lexical dependencies
|
</term>
. Experiments on the
<term>
Broadcast
|
#30652
These combined models help capture long-distance lexical dependencies. |
|
lr-prod,3-4-P99-1022,ak |
dependencies
</term>
. Experiments on the
<term>
|
Broadcast News corpus
|
</term>
show significant improvement in
<term>
|
#30659
Experiments on the Broadcast News corpus show significant improvement in perplexity (10.5% overall and 33.5% on target vocabulary). |
|
measure(ment),10-4-P99-1022,ak |
</term>
show significant improvement in
<term>
|
perplexity
|
</term>
( 10.5 % overall and 33.5 % on
<term>
|
#30666
Experiments on the Broadcast News corpus show significant improvement in perplexity (10.5% overall and 33.5% on target vocabulary). |
|
other,19-4-P99-1022,ak |
</term>
( 10.5 % overall and 33.5 % on
<term>
|
target vocabulary
|
</term>
) . In this paper we describe a systematic
|
#30675
Experiments on the Broadcast News corpus show significant improvement in perplexity (10.5% overall and 33.5% on target vocabulary). |
|