H91-1020 additional constraints in the decoding procedure . For a given sentence the decoded
J14-2005 Learning The use of a coarse-to-fine decoding procedure affects how we learn the parameters
D13-1026 ) . Following the traditional decoding procedure , we resort to the cube-pruning
H93-1019 acoustic likelihood of the set . The decoding procedure is efficiently implemented by
D12-1041 situations that could be met in decoding procedures . But actually , plain structures
D09-1147 weights for downstream consensus decoding procedures . An unexpected second benefit
D09-1147 , we adopt the fast consensus decoding procedure of DeNero et al. ( 2009 ) , which
H93-1019 likelihood is then hypothesized . This decoding procedure can efficiently be implemented
E09-1020 Melamed , 2000 ) . The training and decoding procedures we propose are however different
D13-1107 the details of the tuning and decoding procedures in the MTL-based approach . We
D15-1142 ⟨e_1^I, c_1^J, a_1^K⟩ , the decoding procedure will proceed in a left-right
D12-1041 situations that might be met in decoding procedures . We can see from Table 1 that
D12-1041 cases that could be met in real decoding procedures . To overcome this drawback ,
D10-1019 of the hybrid CRFs training and decoding procedures is higher than that of pipeline
D12-1108 this paper , we have presented a decoding procedure for phrase-based SMT that makes
D13-1083 which assumes an overlap of the decoding procedure and the parameter optimization
D14-1071 iteration number to 30 . To make the decoding procedure computable , we kept only the
D15-1287 sequentially . And due to the way the decoding procedure of PBSMT is defined , it is natural
H94-1028 search error , which means that our decoding procedure did not yield the fi that maximizes
D11-1044 that the efficient left-to-right decoding procedure of phrase-based translation could