W06-1609 translation model was optimized with the simplex algorithm by maximizing over the BLEU score
D08-1076 score. A weakness of the Downhill Simplex algorithm is, however, its decreasing
D10-1059 N-best lists using the Downhill Simplex Algorithm (Press, 2007). But the Downhill
J07-1003 development corpus using the Downhill Simplex algorithm. This combination using the
N10-1129 we run MERT with the Downhill Simplex algorithm on the MT04 dataset. For all
N04-1033 , 2003). We use the Downhill Simplex algorithm from (Press et al., 2002)
J07-1003 development corpus using the Downhill Simplex algorithm (Press et al. 2002). With
E09-1026 solve it efficiently using the simplex algorithm. We evaluate our algorithm by
J07-1003 This is done using the Downhill Simplex algorithm (Press et al. 2002). The resulting
J07-1003 achieved. Again, the Downhill Simplex algorithm is used for optimization.
D08-1076 N-best lists using the Downhill Simplex algorithm (Press et al., 2007, p. 503
P11-2068 weights were obtained with the simplex algorithm (Nelder and Mead, 1965) to
D10-1059 Press, 2007). But the Downhill Simplex Algorithm loses its robustness as the dimension
W06-2601 showed that training with the Simplex algorithm (Press et al., 1988) converges
W05-0827 only two parameters) with the SIMPLEX algorithm (see Table 2). Posterior probability
E09-1026 efficiently. In our study, we used the simplex algorithm (Dantzig, 1963). We generate
N09-2006 1, we found most of the time simplex algorithms perform better than MER in these
N09-2006 and fixed time schedule, both simplex algorithms can run with more random restarts
D11-1022 performing configuration: the simplex algorithm applied to the dual LP. We observe
N04-1033 experiments, the Downhill Simplex algorithm converged after about 200 iterations
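Taken together, these snippets describe one recurring recipe: treat the BLEU score of the 1-best hypotheses selected from an N-best list as a function of the log-linear feature weights, and search that function with the Downhill Simplex (Nelder-Mead) method, often with several random restarts. The sketch below is a minimal, hypothetical illustration of that loop using SciPy's Nelder-Mead optimizer; the toy N-best data and the averaged sentence-BLEU proxy are assumptions made for the example, not code or data from any of the cited papers.

# A minimal sketch (not from any cited paper) of tuning log-linear weights by
# maximizing BLEU over an N-best list with the Downhill Simplex (Nelder-Mead) method.
import numpy as np
from scipy.optimize import minimize

# Hypothetical N-best list: for each source sentence, (feature_vector, sentence_BLEU)
# pairs for its candidate translations.
nbest = [
    [(np.array([-2.1, -4.0, 0.5]), 0.31), (np.array([-1.8, -5.2, 0.9]), 0.27)],
    [(np.array([-3.0, -2.5, 1.1]), 0.45), (np.array([-2.7, -3.9, 0.4]), 0.52)],
]

def neg_bleu(weights):
    """Pick the highest-scoring candidate per sentence under `weights` and
    return the negated average of their sentence-level BLEU scores
    (a simple stand-in for the corpus-level BLEU objective)."""
    picked = [max(cands, key=lambda c: np.dot(weights, c[0]))[1] for cands in nbest]
    return -sum(picked) / len(picked)

# Downhill Simplex search (Nelder and Mead, 1965); real systems restart from
# several random initial points because the objective is piecewise constant.
result = minimize(neg_bleu, x0=np.ones(3), method="Nelder-Mead")
print("tuned weights:", result.x, "avg BLEU:", -result.fun)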