A00-1028 show figures for grammars with pruning inhibited on the most variable
A00-1022 new document vector . Several pruning or specialization heuristics
A00-2030 CKY-style dynamic programming and pruning of low probability elements .
A00-2023 overall optimal solution . This pruning reduces exponentially the total
A00-1028 to be the case indicates that pruning does not penalize difficult sentences
A00-2030 modifiers have identical labels . 9.2 Pruning threshold of the highest scoring
A00-1028 achieved with this form of grammar pruning . However , they could potentially
A00-1028 For both languages , inhibiting pruning on the most variable symbol has
A00-2016 enough data to allow generous pruning . Treebanking is done by humans
A00-1028 second line were collected by pruning the grammar based on the whole
A00-1028 order of grammar symbols . The pruning method we propose consists in
A00-2030 pruning , and only for purposes of pruning , the prior probability of each
A00-1028 simple form of corpus-based grammar pruning is evaluated experimentally on
A00-2023 specifically , because of the pruning , it depends on the number of
A00-1028 result in a speedup , without any pruning at all . To factor out the contribution
A00-2023 of the first two children and pruning the results , before considering
A00-1028 problem for this form of grammar pruning . 3 Experimental Setup The experiments
A00-1028 viewed as higher-order grammar pruning , removing not grammar rules
A00-2014 algorithm ( Young et al. , 1989 ) and pruning settings to produce a pruned
A00-2030 others are pruned . For purposes of pruning , and only for purposes of pruning
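
The snippets above all concern the same technique: discarding low-probability chart entries during CKY-style dynamic programming, typically relative to the highest-scoring entry in a cell (A00-2030). The following is a minimal, hypothetical sketch of that idea, not the system of any cited paper; the toy grammar, lexicon, and BEAM value are illustrative assumptions only.

from collections import defaultdict

# Hypothetical binary PCFG rules: (parent, left_child, right_child) -> probability.
RULES = {
    ("S", "NP", "VP"): 1.0,
    ("NP", "Det", "N"): 0.6,
    ("NP", "NP", "PP"): 0.4,
    ("VP", "V", "NP"): 0.7,
    ("VP", "VP", "PP"): 0.3,
    ("PP", "P", "NP"): 1.0,
}

# Hypothetical lexicon: word -> list of (preterminal, probability); values are illustrative.
LEXICON = {
    "the": [("Det", 1.0)],
    "dog": [("N", 0.5)],
    "saw": [("V", 1.0)],
    "man": [("N", 0.5)],
    "with": [("P", 1.0)],
    "telescope": [("N", 0.5)],
}

BEAM = 1e-3  # keep entries within this factor of the best score in their cell


def cky_parse(words, beam=BEAM):
    """CKY chart parsing with per-cell threshold pruning:
    entries scoring below beam * (best score in the cell) are discarded."""
    n = len(words)
    chart = defaultdict(dict)  # (i, j) -> {symbol: best inside score}

    # Fill the diagonal with lexical entries.
    for i, w in enumerate(words):
        for tag, p in LEXICON.get(w, []):
            chart[(i, i + 1)][tag] = p

    # Combine adjacent spans bottom-up, pruning low-probability entries per cell.
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            cell = chart[(i, j)]
            for k in range(i + 1, j):
                for (parent, left, right), p in RULES.items():
                    lp = chart[(i, k)].get(left)
                    rp = chart[(k, j)].get(right)
                    if lp is not None and rp is not None:
                        score = p * lp * rp
                        if score > cell.get(parent, 0.0):
                            cell[parent] = score
            if cell:
                best = max(cell.values())
                # Threshold pruning relative to the highest-scoring entry in this cell.
                chart[(i, j)] = {s: v for s, v in cell.items() if v >= beam * best}

    return chart[(0, n)].get("S")


if __name__ == "__main__":
    print(cky_parse("the dog saw the man with the telescope".split()))

The beam value trades speed for risk: a tighter threshold prunes more aggressively but may discard constituents needed for the overall optimal parse.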