D08-1034 | development set . The code of feature | selection algorithm | is designed in Figure 3 . 1 . |
C86-1134 | background of the scene ? In NAOS the | selection algorithm | answering the above question |
C02-1079 | of merit for the best analysis | selection algorithm | . First , the acquisition of |
D10-1024 | Contribution ( TC ) , a feature | selection algorithm | developed for document clustering |
D08-1034 | , we designed a simple feature | selection algorithm | to calibrate features for each |
D10-1061 | baseline ( seed ) SMT system . The | selection algorithm | is then trained to choose , from |
D10-1021 | second technique is a new feature | selection algorithm | which uses an ensemble of feature |
A97-1032 | the human judge . ( Since the | selection algorithm | produces only a single tiling |
C86-1134 | real-world events ( visual data ) . The | selection algorithms | are based on low-level , verb-inherent |
C02-1079 | addition to verbs , the best analysis | selection algorithms | could also take advantage of |
C02-1079 | The methods of the best analysis | selection algorithm | described in this paper show |
D10-1044 | implemented a very simple sentence | selection algorithm | in which parallel sentence pairs |
D10-1024 | provided that a reasonable feature | selection algorithm | is employed . The LDA topic model |
D09-1004 | system . 2 ) Using a greedy feature | selection algorithm | , a large-scale feature engineering |
C86-1134 | drives . According to the deep case | selection algorithm | a DIRECTION and LOCATIVE should |
D08-1034 | optimal solution . The feature | selection algorithm | is as follows . Each time we |
D10-1061 | development set D for training the | selection algorithm | and for tuning the SMT . |
C92-4189 | final modification to the sense | selection algorithm | increased performance by 10 % |
D10-1070 | , we adopt the greedy feature | selection algorithm | as described in Jiang and Ng |
D09-1133 | . We adopt the greedy feature | selection algorithm | as described in Jiang and Ng |