Paper ID | Left context | Keyword | Right context
P10-4002 | alternative algorithms for language | model integration | . Further training pipelines
D09-1037 | further complicated by the language | model integration | . Therefore we composed each
P11-2072 | complexity and allow early language | model integration | . However , it creates virtual
J00-3003 | test set . For the purpose of | model integration | , the likelihoods of the Other
P11-1129 | complexity of Chiang 's language | model integration | ( Chiang , 2007 ) . Figure 2
N12-1060 | < XN , XN > . 3.1 Language | Model Integration | The traditional phrase-based
N09-1026 | additional factor of 2 . 6 Language | Model Integration | Large n-gram language models
P09-2036 | consists of parsing and language | model integration | . The parsing stage builds a
P09-2036 | parsing stage and a target language | model integration | stage ( Huang and Chiang , 2007
P10-4002 | couple the translation , language | model integration | ( which we call rescoring ) ,
P11-4007 | confusing . Finally , language | model integration | with RSVP is relatively straightforward
P11-2070 | language side and early language | model integration | on the target language side .
N09-1026 | computational cost of language | model integration | , the efficiency of the parsing
D15-1073 | different errors , which suggests that | model integration | can lead to better accuracies
W09-0424 | chart-parsing , n-gram language | model integration | , beam - and cube-pruning , and
D10-1027 | like Hiero or GHKM : language | model integration | overhead is the most significant
D13-1110 | problems in efficient language | model integration | and requires state reduction
W09-0424 | chart-parsing , ngram language | model integration | , beam - and cube-pruning , and
W08-0402 | chart-parsing , m-gram language | model integration | , beam - and cube-pruning , and
W11-2503 | for technical details . 4.3 | Model integration | We remarked above that the visual