D12-1095 | parser, usually employing a heavy | pruning strategy | . Then the goal of a forest reranker |
D11-1109 | We further present an effective | pruning strategy | to reduce the search space of |
D09-1038 | frequency rules are filtered out. The | pruning strategy | is similar to the cube pruning |
D13-1161 | the scoring function used by the | pruning strategy | to take advantage of all features |
D11-1109 | this section, we introduce two | pruning strategies | to constrain the search space |
D11-1127 | find that with carefully designed | pruning strategies | , HiFST can match the performance |
A00-3002 | multi-level chart parser with a radical | pruning strategy | to the captioning domain. 5 |
D13-1032 | over POS candidates and apply our | pruning strategy | . In a second step we expand |
D12-1076 | shows that our co-occurrence based | pruning strategy | can help render the semantic |
A00-3002 | retained. Although the original | pruning strategy | resulted in many reasonable parses |
D10-1004 | both cases, we employed the same | pruning strategy | as Martins et al. (2009). |
D13-1022 | called bit-strings. A common beam | pruning strategy | is to group together items into |
C04-1059 | and restricted by the applied | pruning strategy | . Ignoring word order, the hypothesis |
A00-3002 | parsing. As it was expected, the | pruning strategy | resulted in a significant reduction |
D09-1123 | is discarded. Therefore, this | pruning strategy | maintains only fully connected |
D14-1042 | cases, our syntactic and semantic | pruning strategy | increased performance (up to |
D09-1106 | training corpus, such global | pruning strategy | usually leads to very large disk |
D12-1044 | able. We therefore use a hybrid | pruning strategy | : each word's set of potential |
A88-1013 | data seem to indicate that this | pruning strategy | is not unreasonable, particularly |
D09-1123 | and thirdly, by modifying the | pruning strategy | to handle the large search space |