D13-1002 Moschitti , 2006 ) . The use of convolution kernels allows us to do away with the
D10-1100 the data ) . Therefore , we use convolution kernels with a linear learning machine
D10-1100 dimensional space . Moreover , Convolution Kernels ( first introduced by Haussler
D13-1002 vector machines ( SVM ) paired with convolution kernels . Experiments show that our proposal
J15-1010 This is exactly what happens in convolution kernels ( Haussler 1999 ) . K is usually
E14-1023 vectors and tree structures . We use convolution kernels ( Haussler , 1999 ) that make
D11-1096 the best syntactic paradigm for convolution kernels . Most importantly , the role
J15-1010 strong link between CDSMs and convolution kernels ( Haussler 1999 ) , which act
J15-1010 link between these models and convolution kernels . 1 . Introduction Distributional
D09-1143 investigating the effectiveness of convolution kernels adapted to syntactic parse trees
N06-1037 extraction . To our knowledge , convolution kernels have not been explored for relation
D09-1143 Wang , 2008 ) . These are not convolution kernels and produce a much lower number
J15-1010 connection between CDSMs and semantic convolution kernels . This link suggests that insights
D11-1096 Structured Lexical Similarity via Convolution Kernels on Dependency Trees
J15-1010 integrated in the development of convolution kernels , with all the benefits offered
D09-1012 approach to feature selection for convolution kernels based on χ²-driven relevance
E06-1015 the above tree kernels are not convolution kernels as those proposed in this article
D10-1100 Programming techniques . Therefore , Convolution Kernels alleviate the need of feature
N06-1037 generating the chunking tree . 1 Convolution kernels were proposed as a concept of
D09-1143 are examples of the well-known convolution kernels used in many NLP applications