P06-1090 frequency estimation , we treat all N-best phrase alignments equally . For comparison , we
P06-1090 estimation . In particular , the N-best phrase alignment described in Section 4.1 is computationally
P06-1090 possible . 6 . The consistent N-best phrase alignments are searched from all combinations
P06-1090 significantly larger space for the N-best phrase alignment . Figure 3 shows an example of
W12-3158 phrases that appear in any of the n-best phrase alignments , leaving the channel probabilities
P06-1090 input to mkcls . We treat all N-best phrase alignments equally . Thus , the phrase alignments
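The snippets above describe estimating phrase translation frequencies by weighting all N-best phrase alignments equally. A minimal sketch of that idea, with hypothetical function names and data layout (not the paper's actual code): each of the N alignments contributes a fractional count of 1/N to every phrase pair it contains, and relative frequencies are then taken per source phrase.

```python
from collections import Counter

def uniform_phrase_counts(nbest_alignments):
    """Count phrase pairs, giving each of the N alignments equal weight.

    nbest_alignments: list of alignments; each alignment is a list of
    (source_phrase, target_phrase) pairs. Every alignment contributes a
    fractional count of 1/N to each phrase pair it contains.
    """
    counts = Counter()
    n = len(nbest_alignments)
    for alignment in nbest_alignments:
        for pair in alignment:
            counts[pair] += 1.0 / n
    return counts

def relative_frequency(counts):
    """phi(t|s) = count(s, t) / sum over t' of count(s, t')."""
    totals = Counter()
    for (src, _tgt), c in counts.items():
        totals[src] += c
    return {(src, tgt): c / totals[src] for (src, tgt), c in counts.items()}
```

For example, if a phrase pair occurs in two of three N-best alignments, it receives a count of 2/3 before normalization.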
P06-1090 we also implemented a different N-best phrase alignment method , where phrase pairs are
P06-1090 constraint in the search for the N-best phrase alignment ( Zens et al. , 2004 ) . The
P06-1090 cope with sparseness , we use N-best phrase alignments and bilingual phrase clustering
P06-1090 target sentence . The consistent N-best phrase alignment can be obtained by using A*
P06-1090 assume this is because the proposed N-best phrase alignment method optimizes the combination
P06-1090 eter estimation method including N-best phrase alignments and bilingual phrase clustering
P06-1090 with this sparseness , we used N-best phrase alignment and bilingual phrase 4.1 N-best
P06-1090 reordering pattern that appeared in the N-best phrase alignments of the training bilingual sentences
P06-1090 phrases are extracted by using the N-best phrase alignment method described in Section 4.1
P06-1090 alignment and bilingual phrase 4.1 N-best Phrase Alignment In order to obtain the Viterbi