Paper ID | Left context | Keyword | Right context
N13-1015 | performance using a relatively simple | backoff smoothing | method. The intuition behind
N01-1023 | tag dictionary (used within a | backoff smoothing | strategy) for labels are not
D13-1143 | For simplicity, we use stupid | backoff smoothing | (Brants et al., 2007). 3
P00-1048 | the formalism of the simplified | backoff smoothing | , each probability whose ML estimate
P04-1008 | statistical language model using Katz | backoff smoothing | technique (Katz, 1987). This
D12-1032 | characters are estimated with | backoff smoothing | . 5 Inference The input to inference
J14-4002 | using Witten-Bell smoothing, and | backoff smoothing | is achieved using failure transitions
P05-2004 | a lower-order model, a form of | backoff smoothing | for dealing with data sparsity
P09-2022 | . Good-Turing discounting and | backoff smoothing | are also applied. Here, it
K15-1015 | features, giving us both a form of | backoff smoothing | and twenty times faster training
K15-1015 | faster training and a form of | backoff smoothing | . The resulting parser is over
D08-1113 | different lengths, a form of | backoff smoothing | (Wu and Khudanpur, 2000).
P10-1157 | using Witten-Bell interpolated | backoff smoothing | (Bilmes and Kirchhoff, 2003
N06-1036 | embedded EM training Incorporating | backoff smoothing | procedures into Bayesian networks
J05-4005 | lists using MLE, together with a | backoff smoothing | schema, as described in Section
P08-1085 | In all experiments, we use the | backoff smoothing | method of (Thede and Harper
P06-1125 | . In our implementation, Katz | Backoff smoothing | technique (Katz, 1987) is
P08-1058 | longer n-gram in such cases. | Backoff smoothing | algorithms typically request
P04-1021 | transliteration units; 2) The | backoff smoothing | of n-gram TM is more effective
E97-1056 | analysed the relationship between | Backoff smoothing | and Memory-Based Learning and
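Several of the hits above name a concrete backoff scheme; the simplest is the stupid backoff of Brants et al. (2007) cited in D13-1143: use the raw relative frequency of the longest matching n-gram, and when its count is zero, back off to the next-shorter context with a fixed penalty (0.4 in the original paper). Below is a minimal illustrative sketch of that scheme; the function names and the toy corpus are not from any of the listed papers.

```python
from collections import Counter

def ngrams(tokens, n):
    """All n-grams of the token list, as tuples."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def stupid_backoff(counts, total, context, word, alpha=0.4):
    """Stupid backoff score S(word | context).

    counts: Counter over n-gram tuples of all orders up to the model order.
    total:  total number of tokens (for the unigram base case).
    Each backoff step to a shorter context multiplies the score by alpha.
    """
    penalty = 1.0
    while context:
        ngram = context + (word,)
        if counts.get(ngram, 0) > 0 and counts.get(context, 0) > 0:
            return penalty * counts[ngram] / counts[context]
        context = context[1:]   # drop the leftmost context word
        penalty *= alpha
    return penalty * counts.get((word,), 0) / total

# Toy trigram model over a six-token corpus.
tokens = "the cat sat on the mat".split()
counts = Counter()
for n in (1, 2, 3):
    counts.update(ngrams(tokens, n))

seen = stupid_backoff(counts, len(tokens), ("the",), "cat")   # 1/2
unseen = stupid_backoff(counts, len(tokens), ("cat",), "mat") # 0.4 * 1/6
```

Note that stupid backoff returns scores rather than normalized probabilities (the scores over a vocabulary do not sum to one), which is exactly why it is cheap: no discounting mass needs to be computed, unlike in Katz backoff (Katz, 1987) or Good-Turing discounting, which also appear in the list.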