other,23-9-J05-1003,bq |
The article also introduces a new
<term>
algorithm
</term>
for the
<term>
boosting approach
</term>
which takes advantage of the
<term>
sparsity of the feature space
</term>
in the
<term>
parsing data
</term>
.
|
#8884
The article also introduces a new algorithm for the boosting approach which takes advantage of the sparsity of the feature space in the parsing data. |
tech,8-10-J05-1003,bq |
Experiments show significant efficiency gains for the new
<term>
algorithm
</term>
over the obvious
<term>
implementation
</term>
of the
<term>
boosting approach
</term>
.
|
#8895
Experiments show significant efficiency gains for the new algorithm over the obvious implementation of the boosting approach. |
other,12-10-J05-1003,bq |
Experiments show significant efficiency gains for the new
<term>
algorithm
</term>
over the obvious
<term>
implementation
</term>
of the
<term>
boosting approach
</term>
.
|
#8899
Experiments show significant efficiency gains for the new algorithm over the obvious implementation of the boosting approach. |
tech,15-10-J05-1003,bq |
Experiments show significant efficiency gains for the new
<term>
algorithm
</term>
over the obvious
<term>
implementation
</term>
of the
<term>
boosting approach
</term>
.
|
#8902
Experiments show significant efficiency gains for the new algorithm over the obvious implementation of the boosting approach. |
tech,21-11-J05-1003,bq |
We argue that the method is an appealing alternative — in terms of both simplicity and efficiency — to work on
<term>
feature selection methods
</term>
within
<term>
log-linear ( maximum-entropy ) models
</term>
.
|
#8926
We argue that the method is an appealing alternative—in terms of both simplicity and efficiency—to work on feature selection methods within log-linear (maximum-entropy) models. |
tech,25-11-J05-1003,bq |
We argue that the method is an appealing alternative — in terms of both simplicity and efficiency — to work on
<term>
feature selection methods
</term>
within
<term>
log-linear ( maximum-entropy ) models
</term>
.
|
#8930
We argue that the method is an appealing alternative—in terms of both simplicity and efficiency—to work on feature selection methods within log-linear (maximum-entropy) models. |
tech,8-12-J05-1003,bq |
Although the experiments in this article are on
<term>
natural language parsing ( NLP )
</term>
, the
<term>
approach
</term>
should be applicable to many other
<term>
NLP problems
</term>
which are naturally framed as
<term>
ranking tasks
</term>
, for example ,
<term>
speech recognition
</term>
,
<term>
machine translation
</term>
, or
<term>
natural language generation
</term>
.
|
#8944
Although the experiments in this article are on natural language parsing (NLP), the approach should be applicable to many other NLP problems which are naturally framed as ranking tasks, for example, speech recognition, machine translation, or natural language generation. |
tech,16-12-J05-1003,bq |
Although the experiments in this article are on
<term>
natural language parsing ( NLP )
</term>
, the
<term>
approach
</term>
should be applicable to many other
<term>
NLP problems
</term>
which are naturally framed as
<term>
ranking tasks
</term>
, for example ,
<term>
speech recognition
</term>
,
<term>
machine translation
</term>
, or
<term>
natural language generation
</term>
.
|
#8952
Although the experiments in this article are on natural language parsing (NLP), the approach should be applicable to many other NLP problems which are naturally framed as ranking tasks, for example, speech recognition, machine translation, or natural language generation. |
other,23-12-J05-1003,bq |
Although the experiments in this article are on
<term>
natural language parsing ( NLP )
</term>
, the
<term>
approach
</term>
should be applicable to many other
<term>
NLP problems
</term>
which are naturally framed as
<term>
ranking tasks
</term>
, for example ,
<term>
speech recognition
</term>
,
<term>
machine translation
</term>
, or
<term>
natural language generation
</term>
.
|
#8959
Although the experiments in this article are on natural language parsing (NLP), the approach should be applicable to many other NLP problems which are naturally framed as ranking tasks, for example, speech recognition, machine translation, or natural language generation. |
tech,30-12-J05-1003,bq |
Although the experiments in this article are on
<term>
natural language parsing ( NLP )
</term>
, the
<term>
approach
</term>
should be applicable to many other
<term>
NLP problems
</term>
which are naturally framed as
<term>
ranking tasks
</term>
, for example ,
<term>
speech recognition
</term>
,
<term>
machine translation
</term>
, or
<term>
natural language generation
</term>
.
|
#8966
Although the experiments in this article are on natural language parsing (NLP), the approach should be applicable to many other NLP problems which are naturally framed as ranking tasks, for example, speech recognition, machine translation, or natural language generation. |
tech,36-12-J05-1003,bq |
Although the experiments in this article are on
<term>
natural language parsing ( NLP )
</term>
, the
<term>
approach
</term>
should be applicable to many other
<term>
NLP problems
</term>
which are naturally framed as
<term>
ranking tasks
</term>
, for example ,
<term>
speech recognition
</term>
,
<term>
machine translation
</term>
, or
<term>
natural language generation
</term>
.
|
#8972
Although the experiments in this article are on natural language parsing (NLP), the approach should be applicable to many other NLP problems which are naturally framed as ranking tasks, for example, speech recognition, machine translation, or natural language generation. |
tech,39-12-J05-1003,bq |
Although the experiments in this article are on
<term>
natural language parsing ( NLP )
</term>
, the
<term>
approach
</term>
should be applicable to many other
<term>
NLP problems
</term>
which are naturally framed as
<term>
ranking tasks
</term>
, for example ,
<term>
speech recognition
</term>
,
<term>
machine translation
</term>
, or
<term>
natural language generation
</term>
.
|
#8975
Although the experiments in this article are on natural language parsing (NLP), the approach should be applicable to many other NLP problems which are naturally framed as ranking tasks, for example, speech recognition, machine translation, or natural language generation. |
tech,43-12-J05-1003,bq |
Although the experiments in this article are on
<term>
natural language parsing ( NLP )
</term>
, the
<term>
approach
</term>
should be applicable to many other
<term>
NLP problems
</term>
which are naturally framed as
<term>
ranking tasks
</term>
, for example ,
<term>
speech recognition
</term>
,
<term>
machine translation
</term>
, or
<term>
natural language generation
</term>
.
|
#8979
Although the experiments in this article are on natural language parsing (NLP), the approach should be applicable to many other NLP problems which are naturally framed as ranking tasks, for example, speech recognition, machine translation, or natural language generation. |