tool,30-3-C04-1035,bq |
We then use the
<term>
predicates
</term>
of such
<term>
clauses
</term>
to create a set of
<term>
domain independent features
</term>
to annotate an input
<term>
dataset
</term>
, and run two different
<term>
machine learning algorithms
</term>
:
<term>
SLIPPER
</term>
, a
<term>
rule-based learning algorithm
</term>
, and
<term>
TiMBL
</term>
, a
<term>
memory-based system
</term>
.
|
#5212
We then use the predicates of such clauses to create a set of domain independent features to annotate an input dataset, and run two different machine learning algorithms: SLIPPER, a rule-based learning algorithm, and TiMBL, a memory-based system. |
tech,33-3-C04-1035,bq |
We then use the
<term>
predicates
</term>
of such
<term>
clauses
</term>
to create a set of
<term>
domain independent features
</term>
to annotate an input
<term>
dataset
</term>
, and run two different
<term>
machine learning algorithms
</term>
:
<term>
SLIPPER
</term>
, a
<term>
rule-based learning algorithm
</term>
, and
<term>
TiMBL
</term>
, a
<term>
memory-based system
</term>
.
|
#5215
We then use the predicates of such clauses to create a set of domain independent features to annotate an input dataset, and run two different machine learning algorithms: SLIPPER, a rule-based learning algorithm, and TiMBL, a memory-based system. |
tech,41-3-C04-1035,bq |
We then use the
<term>
predicates
</term>
of such
<term>
clauses
</term>
to create a set of
<term>
domain independent features
</term>
to annotate an input
<term>
dataset
</term>
, and run two different
<term>
machine learning algorithms
</term>
:
<term>
SLIPPER
</term>
, a
<term>
rule-based learning algorithm
</term>
, and
<term>
TiMBL
</term>
, a
<term>
memory-based system
</term>
.
|
#5223
We then use the predicates of such clauses to create a set of domain independent features to annotate an input dataset, and run two different machine learning algorithms: SLIPPER, a rule-based learning algorithm, and TiMBL, a memory-based system. |
tech,4-1-C04-1035,bq |
This paper presents a
<term>
machine learning
</term>
approach to bare
<term>
sluice disambiguation
</term>
in
<term>
dialogue
</term>
.
|
#5153
This paper presents a machine learning approach to bare sluice disambiguation in dialogue. |
tool,38-3-C04-1035,bq |
We then use the
<term>
predicates
</term>
of such
<term>
clauses
</term>
to create a set of
<term>
domain independent features
</term>
to annotate an input
<term>
dataset
</term>
, and run two different
<term>
machine learning algorithms
</term>
:
<term>
SLIPPER
</term>
, a
<term>
rule-based learning algorithm
</term>
, and
<term>
TiMBL
</term>
, a
<term>
memory-based system
</term>
.
|
#5220
We then use the predicates of such clauses to create a set of domain independent features to annotate an input dataset, and run two different machine learning algorithms: SLIPPER, a rule-based learning algorithm, and TiMBL, a memory-based system. |
model,15-2-C04-1035,bq |
We extract a set of
<term>
heuristic principles
</term>
from a corpus-based sample and formulate them as
<term>
probabilistic Horn clauses
</term>
.
|
#5178
We extract a set of heuristic principles from a corpus-based sample and formulate them as probabilistic Horn clauses. |
tech,9-1-C04-1035,bq |
This paper presents a
<term>
machine learning
</term>
approach to bare
<term>
sluice disambiguation
</term>
in
<term>
dialogue
</term>
.
|
#5158
This paper presents a machine learning approach to bare sluice disambiguation in dialogue. |
tech,1-4-C04-1035,bq |
Both
<term>
learners
</term>
perform well , yielding similar
<term>
success rates
</term>
of approx 90 % .
|
#5227
Both learners perform well, yielding similar success rates of approx 90%. |
tech,26-3-C04-1035,bq |
We then use the
<term>
predicates
</term>
of such
<term>
clauses
</term>
to create a set of
<term>
domain independent features
</term>
to annotate an input
<term>
dataset
</term>
, and run two different
<term>
machine learning algorithms
</term>
:
<term>
SLIPPER
</term>
, a
<term>
rule-based learning algorithm
</term>
, and
<term>
TiMBL
</term>
, a
<term>
memory-based system
</term>
.
|
#5208
We then use the predicates of such clauses to create a set of domain independent features to annotate an input dataset, and run two different machine learning algorithms: SLIPPER, a rule-based learning algorithm, and TiMBL, a memory-based system. |
other,12-1-C04-1035,bq |
This paper presents a
<term>
machine learning
</term>
approach to bare
<term>
sluice disambiguation
</term>
in
<term>
dialogue
</term>
.
|
#5161
This paper presents a machine learning approach to bare sluice disambiguation in dialogue. |
lr,20-3-C04-1035,bq |
We then use the
<term>
predicates
</term>
of such
<term>
clauses
</term>
to create a set of
<term>
domain independent features
</term>
to annotate an input
<term>
dataset
</term>
, and run two different
<term>
machine learning algorithms
</term>
:
<term>
SLIPPER
</term>
, a
<term>
rule-based learning algorithm
</term>
, and
<term>
TiMBL
</term>
, a
<term>
memory-based system
</term>
.
|
#5202
We then use the predicates of such clauses to create a set of domain independent features to annotate an input dataset, and run two different machine learning algorithms: SLIPPER, a rule-based learning algorithm, and TiMBL, a memory-based system. |
other,5-2-C04-1035,bq |
We extract a set of
<term>
heuristic principles
</term>
from a corpus-based sample and formulate them as
<term>
probabilistic Horn clauses
</term>
.
|
#5168
We extract a set of heuristic principles from a corpus-based sample and formulate them as probabilistic Horn clauses. |
other,13-3-C04-1035,bq |
We then use the
<term>
predicates
</term>
of such
<term>
clauses
</term>
to create a set of
<term>
domain independent features
</term>
to annotate an input
<term>
dataset
</term>
, and run two different
<term>
machine learning algorithms
</term>
:
<term>
SLIPPER
</term>
, a
<term>
rule-based learning algorithm
</term>
, and
<term>
TiMBL
</term>
, a
<term>
memory-based system
</term>
.
|
#5195
We then use the predicates of such clauses to create a set of domain independent features to annotate an input dataset, and run two different machine learning algorithms: SLIPPER, a rule-based learning algorithm, and TiMBL, a memory-based system. |
other,13-5-C04-1035,bq |
The results show that the
<term>
features
</term>
in terms of which we formulate our
<term>
heuristic principles
</term>
have significant
<term>
predictive power
</term>
, and that
<term>
rules
</term>
that closely resemble our
<term>
Horn clauses
</term>
can be learnt automatically from these
<term>
features
</term>
.
|
#5253
The results show that the features in terms of which we formulate our heuristic principles have significant predictive power, and that rules that closely resemble our Horn clauses can be learnt automatically from these features. |
model,27-5-C04-1035,bq |
The results show that the
<term>
features
</term>
in terms of which we formulate our
<term>
heuristic principles
</term>
have significant
<term>
predictive power
</term>
, and that
<term>
rules
</term>
that closely resemble our
<term>
Horn clauses
</term>
can be learnt automatically from these
<term>
features
</term>
.
|
#5267
The results show that the features in terms of which we formulate our heuristic principles have significant predictive power, and that rules that closely resemble our Horn clauses can be learnt automatically from these features. |
other,17-5-C04-1035,bq |
The results show that the
<term>
features
</term>
in terms of which we formulate our
<term>
heuristic principles
</term>
have significant
<term>
predictive power
</term>
, and that
<term>
rules
</term>
that closely resemble our
<term>
Horn clauses
</term>
can be learnt automatically from these
<term>
features
</term>
.
|
#5257
The results show that the features in terms of which we formulate our heuristic principles have significant predictive power, and that rules that closely resemble our Horn clauses can be learnt automatically from these features. |
measure(ment),7-4-C04-1035,bq |
Both
<term>
learners
</term>
perform well , yielding similar
<term>
success rates
</term>
of approx 90 % .
|
#5233
Both learners perform well, yielding similar success rates of approx 90%. |
other,7-3-C04-1035,bq |
We then use the
<term>
predicates
</term>
of such
<term>
clauses
</term>
to create a set of
<term>
domain independent features
</term>
to annotate an input
<term>
dataset
</term>
, and run two different
<term>
machine learning algorithms
</term>
:
<term>
SLIPPER
</term>
, a
<term>
rule-based learning algorithm
</term>
, and
<term>
TiMBL
</term>
, a
<term>
memory-based system
</term>
.
|
#5189
We then use the predicates of such clauses to create a set of domain independent features to annotate an input dataset, and run two different machine learning algorithms: SLIPPER, a rule-based learning algorithm, and TiMBL, a memory-based system. |
other,22-5-C04-1035,bq |
The results show that the
<term>
features
</term>
in terms of which we formulate our
<term>
heuristic principles
</term>
have significant
<term>
predictive power
</term>
, and that
<term>
rules
</term>
that closely resemble our
<term>
Horn clauses
</term>
can be learnt automatically from these
<term>
features
</term>
.
|
#5262
The results show that the features in terms of which we formulate our heuristic principles have significant predictive power, and that rules that closely resemble our Horn clauses can be learnt automatically from these features. |
other,5-5-C04-1035,bq |
The results show that the
<term>
features
</term>
in terms of which we formulate our
<term>
heuristic principles
</term>
have significant
<term>
predictive power
</term>
, and that
<term>
rules
</term>
that closely resemble our
<term>
Horn clauses
</term>
can be learnt automatically from these
<term>
features
</term>
.
|
#5245
The results show that the features in terms of which we formulate our heuristic principles have significant predictive power, and that rules that closely resemble our Horn clauses can be learnt automatically from these features. |