E09-1017 also very high -- over 90% for a multi-layer perceptron classifier. We also test the
H91-1052 the normalization experiment, a multi-layer perceptron (MLP) was proposed to estimate
H91-1031 constraints, we have chosen to use the multi-layer perceptron (MLP) [7]. In
H90-1074 been investigating the use of multi-layer perceptrons for phonetic classification.
H94-1068 we have explored, is based on multi-layer perceptrons (MLP). The MLP has 3 lay-
H90-1074 phonetic classification based on multi-layer perceptrons (MLP). These experiments involve
E09-1017 Overall the best classifier is the multi-layer perceptron. On the task using all available
H92-1038 probabilities are estimated with separate multi-layer perceptrons. 2.2 Classification Component
N10-1124 toolkit3: decision trees, SVM, multi-layer perceptron and Naive Bayes. Combining multiple
D10-1018 investigated both finite state and multi-layer perceptron methods for the task, where
H05-1009 In our experiments, we used a multi-layer perceptron (MLP) consisting of 1 input
H94-1113 completed. It consists of 13 multi-layer perceptrons (MLP), one for each of the
H91-1054 difference between speakers. As multi-layer perceptrons (MLP) can be used to approximate
D15-1277 using f(Si) as input. We use a multi-layer perceptron as the prediction model. The
E09-1017 , support vector machines and multi-layer perceptron perform similarly, with support
N12-2009 ), non-linear models such as multi-layer perceptrons with more than one hidden layer
H91-1028 acceptance" or "rejection" by a multi-layer perceptron (MLP). This rejection procedure
H05-1009 classifiers. 3 Neural Networks A multi-layer perceptron (MLP) is a feed-forward neural
H91-1054 speakers. Because of this, the multi-layer perceptron (MLP) with the back-propagation
E97-1005 to all output units), and a multi-layer perceptron with all input units connected
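All of the snippets above describe the same basic model: a feed-forward multi-layer perceptron used as a classifier or probability estimator (see e.g. H05-1009, which defines the MLP as a feed-forward network). As a minimal illustration only -- not code from any of the cited papers, with all weights and sizes chosen arbitrarily -- a one-hidden-layer forward pass with sigmoid units can be sketched as:

```python
import math
import random

def mlp_forward(x, w1, b1, w2, b2):
    """Forward pass of a one-hidden-layer MLP with sigmoid activations.

    x  : input vector
    w1 : hidden-layer weight matrix (one row of weights per hidden unit)
    b1 : hidden-layer biases
    w2 : output-layer weight matrix (one row per output unit)
    b2 : output-layer biases
    """
    sigmoid = lambda z: 1.0 / (1.0 + math.exp(-z))
    # Hidden layer: h_j = sigmoid(sum_i w1[j][i] * x[i] + b1[j])
    h = [sigmoid(sum(wji * xi for wji, xi in zip(wj, x)) + bj)
         for wj, bj in zip(w1, b1)]
    # Output layer: same form, applied to the hidden activations
    y = [sigmoid(sum(wkj * hj for wkj, hj in zip(wk, h)) + bk)
         for wk, bk in zip(w2, b2)]
    return y

# Toy configuration: 3 inputs, 2 hidden units, 1 output (sizes are arbitrary)
random.seed(0)
w1 = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
b1 = [0.0, 0.0]
w2 = [[random.uniform(-1, 1) for _ in range(2)]]
b2 = [0.0]
out = mlp_forward([1.0, 0.5, -0.2], w1, b1, w2, b2)
print(out)  # a single sigmoid output in (0, 1)
```

With a squashing output unit like this, the network's output can be read as a class-membership score, which is how several of the papers above (e.g. H92-1038) use MLP outputs as probability estimates.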