D09-1087 reported no improvements from self-training a PCFG parser on the standard
D09-1087 ubiquitous . Early investigations on self-training for parsing have had mixed results
D08-1106 Bootstrapping Bootstrapping ( or self-training ) is a general framework for
D09-1087 this section , we discuss how self-training is applied to train a PCFG-LA
D09-1087 enhancements in Section 2 , and discuss self-training in Section 3 . We then outline
D09-1033 by Ng and Cardie ( 2003 ) in a self-training set-up . However , while they
D08-1071 particular , we compare both hints and self-training to the two baselines , and then
D09-1087 This is the approach we chose for self-training . An alternative approach is
D08-1071 helps significantly , even over self-training . We further compare the algorithms
D09-1087 's parser that the addition of self-training data helps the former parser
D09-1060 McClosky et al. ( 2006 ) presented a self-training approach for phrase structure
D08-1114 examples of SSL algorithms include self-training ( Yarowsky , 1995 ) and co-training
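The snippet above names self-training (Yarowsky, 1995) as a canonical semi-supervised learning algorithm. A minimal sketch of the generic loop it refers to, on toy one-dimensional data: train on the labeled seed, label the unlabeled pool, and fold back only the confident predictions. The threshold classifier, the confidence cutoff, and all data values here are illustrative assumptions, not taken from any of the cited papers.

```python
# Minimal self-training loop (Yarowsky-style) on toy 1-D data.
# Everything here is an illustrative assumption: the "classifier" is a
# midpoint threshold between class means, and confidence is distance
# from that boundary.

def train(labeled):
    # Fit the threshold classifier: midpoint of the two class means.
    xs0 = [x for x, y in labeled if y == 0]
    xs1 = [x for x, y in labeled if y == 1]
    return (sum(xs0) / len(xs0) + sum(xs1) / len(xs1)) / 2

def predict(threshold, x):
    # Return a label and a confidence (distance from the boundary).
    label = 1 if x > threshold else 0
    return label, abs(x - threshold)

def self_train(labeled, unlabeled, rounds=3, min_conf=1.0):
    labeled = list(labeled)
    pool = list(unlabeled)
    for _ in range(rounds):
        t = train(labeled)
        # Keep only examples the current model labels confidently.
        scored = [(x, predict(t, x)) for x in pool]
        keep = [(x, lab) for x, (lab, c) in scored if c >= min_conf]
        if not keep:
            break  # no confident additions left; stop early
        labeled += keep
        kept_xs = {x for x, _ in keep}
        pool = [x for x in pool if x not in kept_xs]
    return train(labeled)

# Two labeled seeds, four unlabeled points near each class.
threshold = self_train([(0.0, 0), (10.0, 1)], [1.0, 2.0, 8.0, 9.0])
print(threshold)  # → 5.0
```

In round one all four unlabeled points sit at least 3.0 from the boundary at 5.0, so every one is absorbed with its predicted label; the retrained midpoint stays at 5.0 and the loop terminates when the pool is empty. Co-training, mentioned alongside it in D08-1114, differs only in that two models trained on different feature views label data for each other instead of one model labeling for itself.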
D09-1087 Chinese news articles are used for self-training . Since the Chinese parsers in
D09-1087 and alignment edges . <title> Self-Training PCFG Grammars with Latent Annotations Across Languages
D09-1087 how much does it benefit from self-training ? The first question is of special
D08-1106 have described in this paper , self-training can be thought of as a graph-based
D08-1090 system 's own output , known as Self-Training ( Ueffing , 2006 ) has previously
D09-1087 in the BLLIP corpus for English self-training . For the Chinese experiments
D09-1087 English and Chinese . With self-training , a fraction of the WSJ or CTB6
D08-1071 is relatively ignored . • Self-training performance often degrades as