W03-1024 | tried to find useful features by feature elimination . Since features are not completely
P04-1016 | Cancedda et al. , 2003 ) use a feature elimination method based on the size of sub-sequence
W03-1024 | system 's performance . Hence , feature elimination is more reliable for reducing
N04-4002 | predictive information lost during feature elimination . In order to compare the performances
W03-1024 | number of features . However , feature elimination takes a long time . On the other
W12-2008 | was removed . Next , a recursive feature elimination ( RFE ) based on a linear regression
W15-0709 | criterion and perform recursive feature elimination by repeating the following three
W06-3401 | subset selection using recursive feature elimination were carried out on the data
P08-1068 | parsers as the threshold for lexical feature elimination ( see Section 3.2 ) is varied
P12-1002 | work to weight-based recursive feature elimination ( RFE ) ( Lal et al. , 2006 )
W15-0709 | using the method of recursive feature elimination . The resulting feature set is
W10-2801 | applied : linear projections , feature elimination and random approximations . The
W03-1024 | resolution . We confirmed this by feature elimination . <title> A Maximum Entropy Chinese
W06-1657 | is reminiscent of the recursive feature elimination procedure first proposed in the
W13-2236 | seen as a weight-based backward feature elimination variant of Obozinski et al. (
S15-2073 | features , we performed backward feature elimination using the supplied training and
W12-3005 | the noisiest , averaging 96.2 % feature elimination . The classifiers trained on
W14-1809 | longer n-grams and a recursive feature elimination ( Kuhn and Johnson , 2013 , p.
W15-0709 | feature elimination Recursive feature elimination is a greedy algorithm that relies
W10-2801 | ; Baroni and Lenci , 2008 ) . Feature elimination reduces the dimensionality by
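Several of the snippets above (e.g. W15-0709, P12-1002) describe recursive feature elimination as a greedy, weight-based procedure: fit a linear model, drop the feature with the least informative weight, and repeat. A minimal sketch of that generic idea, not the exact procedure of any cited paper (the `rfe` function, its stopping criterion, and the toy data are all assumptions for illustration):

```python
import numpy as np

def rfe(X, y, n_keep):
    """Weight-based recursive feature elimination (generic sketch):
    repeatedly fit a least-squares linear model on the surviving
    features and drop the one with the smallest |weight|."""
    active = list(range(X.shape[1]))
    while len(active) > n_keep:
        # Fit linear weights on the currently active features.
        w, *_ = np.linalg.lstsq(X[:, active], y, rcond=None)
        # Greedy step: eliminate the feature with the weakest weight.
        active.pop(int(np.argmin(np.abs(w))))
    return active

# Toy data where only features 0 and 2 carry signal.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = 3.0 * X[:, 0] - 2.0 * X[:, 2] + 0.01 * rng.normal(size=200)
print(rfe(X, y, n_keep=2))  # → [0, 2]
```

Because each round refits the model, the cost grows with the number of eliminated features, which is consistent with the W03-1024 remark that feature elimination "takes a long time".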