D15-1180 stack multiple tensor-based feature mapping layers. That is, the input
D12-1065 Specifically, let φ represent a feature mapping from nodes to R^D (for example
D09-1054 Encoding Relations. We use a joint feature mapping to model the relations between
D15-1180 algebra to introduce a non-linear feature mapping that operates on nonconsecutive
D15-1180 convolution operation with other feature mappings. Indeed, we appeal to tensor
D11-1126 features. We considered alternative feature mappings in Figure 1, finding that mapping
D15-1180 tensor. Typical n-gram feature mappings where concatenated word vectors
D15-1262 Concatenation and product yield two new feature mappings, respectively: Φh,M
D09-1054 answer extraction, the joint feature mapping can be defined as follows (2)
D15-1180 sentence. 8 Conclusion. We proposed a feature mapping operator for convolutional neural
E14-1067 scores during manual evaluation. 3 Features mapping content type to appropriate length
D11-1090 statistics-based and document-based feature mapping for a discriminative word segmenter
D10-1095 classification and use a joined feature mapping of an instance x and a labeling
H94-1065 we present an approach towards feature mapping by modeling the difference between
D13-1016 suitably produce a generalizable feature mapping function for domain adaptation
D15-1180 to another sequence-to-sequence feature mapping. The simplest strategy (adopted
D09-1054 ), respectively. We used the feature mapping Φea(xj) defined in Equation
D14-1101 , where ϕd(z, y) is a feature mapping for the discrete part of z and
D09-1054 are (a) definition of joint feature mapping for encoding relations, (b
D09-1054 ), ψn(xj, …), where ψn is a feature mapping for a given sentence and a label
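Several of the snippets above (D09-1054, D10-1095, D14-1101) refer to a joint feature mapping defined over an instance and a label. As a minimal illustrative sketch only, assuming nothing about the specific definitions used in those papers, the following shows one common generic form: the outer product of an input feature vector with a one-hot label indicator, flattened into a single vector.

```python
import numpy as np

def joint_feature_mapping(x, y, num_labels):
    """Generic joint feature mapping phi(x, y): the outer product of the
    input feature vector x with a one-hot indicator of label y, flattened.
    This is only an illustrative form, not the exact definition from any
    of the papers listed above."""
    indicator = np.zeros(num_labels)
    indicator[y] = 1.0
    # Block y of the result holds x; all other blocks are zero.
    return np.outer(indicator, x).ravel()

# Example: a 3-dimensional instance paired with label 1 out of 4 labels.
x = np.array([0.2, -1.0, 0.5])
phi = joint_feature_mapping(x, y=1, num_labels=4)
print(phi.shape)  # (12,) -- one block per label, nonzero only for y = 1
```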