W09-1124 conditional log-likelihood learning ( L-BFGS optimization ) . We used this latter method
P15-2028 cross entropy error . We use the L-BFGS optimization algorithm to optimize our objective
P06-1028 Sha and Pereira , 2003 ) with L-BFGS optimization . For MCE , we only considered
D13-1137 parameters after t iterations of the L-BFGS optimization . Our preliminary experimental
D09-1014 linear-chain CRF on the alignment - L-BFGS optimization procedure checks whether the
P14-1066 Careful implementation of the L-BFGS optimization based on the BLEU - centric objective
N09-1007 DPLVMs and CRFs . We apply the L-BFGS optimization algorithm to optimize the objective
D10-1061 parator . We use the standard L-BFGS optimization algorithm ( Liu and Nocedal ,
P14-2044 Klementiev , 2012 ) . We interleave L-BFGS optimization within sampling , as in Monte
W05-1505 the MaxEnt estimator using the L-BFGS optimization algorithms and Gaussian smoothing
W06-1643 training log-linear models with L-BFGS optimization techniques and maximize the loglikelihood
P12-1109 BILOU " encoding scheme . with L-BFGS optimization . We use the character/phoneme
P10-2028 these five runs . Also we perform L-BFGS optimization to automatically adjust the non-informative
P08-4003 of Liu and Nocedal 's ( 1989 ) L-BFGS optimization code , with a function for programmatic
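Nearly every snippet above applies L-BFGS (Liu and Nocedal, 1989) to maximize a log-likelihood or minimize a cross-entropy objective. As a reference point, here is a minimal pure-Python sketch of the limited-memory two-loop recursion with a backtracking Armijo line search, run on a toy quadratic objective; the function names and the objective are illustrative assumptions, not taken from any of the cited papers.

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def lbfgs(f_grad, x0, m=5, max_iter=100, tol=1e-8):
    """Minimize f via L-BFGS with memory m.

    f_grad(x) must return the pair (f(x), gradient list).
    Hypothetical sketch: no safeguards beyond a curvature check.
    """
    x = list(x0)
    fx, g = f_grad(x)
    history = []  # recent (s, y, rho) curvature pairs, oldest first
    for _ in range(max_iter):
        if max(abs(gi) for gi in g) < tol:
            break
        # Two-loop recursion: apply the inverse-Hessian approximation
        # to the gradient without ever forming the Hessian.
        q = list(g)
        alphas = []
        for s, y, rho in reversed(history):  # newest to oldest
            a = rho * dot(s, q)
            alphas.append(a)
            q = [qi - a * yi for qi, yi in zip(q, y)]
        if history:
            s, y, rho = history[-1]
            gamma = dot(s, y) / dot(y, y)  # initial Hessian scaling
            q = [gamma * qi for qi in q]
        for (s, y, rho), a in zip(history, reversed(alphas)):
            b = rho * dot(y, q)
            q = [qi + (a - b) * si for qi, si in zip(q, s)]
        d = [-qi for qi in q]  # descent direction
        # Backtracking line search on the Armijo sufficient-decrease rule.
        step, gd = 1.0, dot(g, d)
        while True:
            x_new = [xi + step * di for xi, di in zip(x, d)]
            f_new, g_new = f_grad(x_new)
            if f_new <= fx + 1e-4 * step * gd or step < 1e-12:
                break
            step *= 0.5
        s = [xn - xi for xn, xi in zip(x_new, x)]
        y = [gn - gi for gn, gi in zip(g_new, g)]
        sy = dot(s, y)
        if sy > 1e-12:  # keep only pairs with positive curvature s'y
            history.append((s, y, 1.0 / sy))
            if len(history) > m:
                history.pop(0)
        x, fx, g = x_new, f_new, g_new
    return x, fx

# Toy convex objective: f(x) = sum_i (x_i - i)^2, minimized at x_i = i.
def quadratic(x):
    f = sum((xi - i) ** 2 for i, xi in enumerate(x))
    g = [2 * (xi - i) for i, xi in enumerate(x)]
    return f, g

x_opt, f_opt = lbfgs(quadratic, [0.0] * 4)
```

In the CRF and MaxEnt settings cited above, `f_grad` would instead return the regularized negative log-likelihood and its gradient over the training set; production systems use tuned implementations such as Liu and Nocedal's original Fortran code (P08-4003) rather than a sketch like this.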