W03-1724 | rules to correct mistakes in the | probabilistic segmentation | of ambiguous substrings . This
H92-1031 | He shows that incorporating a | probabilistic segmentation | model improves the performance
W95-0109 | training and cost . Practical | probabilistic segmentation | models can achieve quite satisfactory
P00-1078 | matching , maximal matching and | probabilistic segmentation | had been applied in the early
W03-1724 | If no rule is applicable , its | probabilistic segmentation | is retained . For the bakeoff
P08-1084 | frequent strings . Our work builds on | probabilistic segmentation | approaches such as Morfessor
W03-1724 | the language model is trained , | probabilistic segmentation | can not avoid mistakes on ambiguous
N03-1018 | may not be words . Therefore , a | probabilistic segmentation | model that accommodates word
W10-1760 | , ( Brent , 1999 ) proposes a | probabilistic segmentation | model based on unigram word distri
W03-1724 | detect the discrepancies of our | probabilistic segmentation | and the standard segmentation
W15-1108 | significant interest in the early | probabilistic segmentation | strategies infants use ( Brent
W03-1724 | general-purpose ngram model for | probabilistic segmentation | and a case - or example-based
W03-1724 | ( also denoted as cn1 ) , its | probabilistic segmentation | into a word sequence w1w2 ·
N09-1040 | lexical chains . TEXTSEG employs a | probabilistic segmentation | objective that is similar to
W03-1724 | this side-effect : ( 1 ) retrain | probabilistic segmentation | -- a conservative strategy ;